NDK -- Live Stream Pushing with Camera and AudioRecord

Last time we created a project in Android Studio and integrated the tools needed for live stream pushing:
  • rtmpdump: RTMP stream pushing
  • x264: video encoding
  • faac: audio encoding
Article: NDK -- Setting Up the Live Push Framework in Android Studio
Live pushing also needs a streaming media server. I'm using a virtual machine here; if you have the means, you can use a real server. For how to set up the streaming server, see my earlier article: Setting Up an Nginx Streaming Media Server
With that groundwork done, today we implement the actual push.
1. The UI is very simple; here is the layout file:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"
    tools:context=".MainActivity">

    <LinearLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:orientation="horizontal">

        <Button
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="startPush"
            android:text="開始推流" />

        <Button
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="stopPush"
            android:text="停止推流" />

        <Button
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="swtichCamera"
            android:text="切換攝像頭" />
    </LinearLayout>


    <SurfaceView
        android:id="@+id/sv_preview"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</LinearLayout>
One button starts the push, one stops it, one switches between the front and back cameras, and a SurfaceView shows the camera preview.
2. Define the native methods that hand the Java-side camera video and microphone audio data, plus the A/V parameters, down to the native layer:
public class NativePush {
    private static final String TAG = NativePush.class.getSimpleName();

    static {
        System.loadLibrary("native-lib");
    }

    private LiveStateChangeListener mListener;

    public void setLiveStateChangeListener(LiveStateChangeListener listener) {
        mListener = listener;
    }

    /**
     * Callback from the native layer
     *
     * @param code -96: failed to configure the audio encoder
     *             -97: failed to open the audio encoder
     *             -98: failed to open the video encoder
     *             -99: failed to establish the RTMP connection
     *             -100: RTMP disconnected
     */
    public void onPostNativeError(int code) {
        Log.e(TAG, "onPostNativeError:" + code);
        //stop pushing
        stopPush();
        Log.d("NativePush", code + "");
        if (null != mListener) {
            mListener.onErrorPusher(code);
        }
    }

    /**
     * Callback from the native layer:
     * push connection established (100) and push thread exit (101)
     *
     * @param state
     */
    public void onPostNativeState(int state) {
        if (state == 100) {
            mListener.onStartPusher();
        } else if (state == 101) {
            mListener.onStopPusher();
        }
    }

    //set video parameters
    public native void setVideoParams(int width, int height, int bitrate, int fps);

    //set audio parameters
    public native void setAudioParams(int sample, int channel);

    //push a video frame
    public native void pushVideo(byte[] buffer);

    //push an audio frame
    public native void pushAudio(byte[] buffer, int size);

    //start the push thread
    public native void startPush(String url);

    //stop pushing
    public native void stopPush();

    //get the audio input size (in samples) expected by the encoder
    public native int getInputSamples();
}
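The LiveStateChangeListener callback interface used above isn't listed in the original; a minimal sketch matching the three callbacks that NativePush and PushHelper rely on:

package com.aruba.rtmppushapplication.push.natives;

public interface LiveStateChangeListener {
    //called when the native layer reports an error (see the codes in onPostNativeError)
    void onErrorPusher(int code);

    //called once the RTMP connection is established (state 100)
    void onStartPusher();

    //called when the push thread exits (state 101)
    void onStopPusher();
}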
3. Define parameter classes for video and audio so they can be managed in one place later:
package com.aruba.rtmppushapplication.push.params;

/**
 * Video parameters
 * Created by aruba on 2021/1/12.
 */
public class VideoParams {
    //frame rate
    private int fps;
    private int videoWidth;
    private int videoHeight;
    //bitrate
    private int bitrate;
    private int cameraId;

    private VideoParams(int videoWidth, int videoHeight, int cameraId) {
        this.videoWidth = videoWidth;
        this.videoHeight = videoHeight;
        this.cameraId = cameraId;
    }

    public int getFps() {
        return fps;
    }

    public void setFps(int fps) {
        this.fps = fps;
    }

    public int getVideoWidth() {
        return videoWidth;
    }

    public void setVideoWidth(int videoWidth) {
        this.videoWidth = videoWidth;
    }

    public int getVideoHeight() {
        return videoHeight;
    }

    public void setVideoHeight(int videoHeight) {
        this.videoHeight = videoHeight;
    }

    public int getBitrate() {
        return bitrate;
    }

    public void setBitrate(int bitrate) {
        this.bitrate = bitrate;
    }

    public int getCameraId() {
        return cameraId;
    }

    public void setCameraId(int cameraId) {
        this.cameraId = cameraId;
    }

    public static class Builder {
        private int fps = 25;
        private int videoWidth = -1;
        private int videoHeight = -1;
        private int bitrate = 480000;
        private int cameraId = -1;

        public Builder fps(int fps) {
            this.fps = fps;
            return this;
        }

        public Builder videoSize(int videoWidth, int videoHeight) {
            this.videoHeight = videoHeight;
            this.videoWidth = videoWidth;
            return this;
        }

        public Builder bitrate(int bitrate) {
            this.bitrate = bitrate;
            return this;
        }

        public Builder cameraId(int cameraId) {
            this.cameraId = cameraId;
            return this;
        }

        public VideoParams build() {
            if (videoWidth == -1 || videoHeight == -1 || cameraId == -1) {
                throw new RuntimeException("videoWidth, videoHeight and cameraId must be configured");
            }
            VideoParams videoParams = new VideoParams(videoWidth, videoHeight, cameraId);
            videoParams.setBitrate(bitrate);
            videoParams.setFps(fps);

            return videoParams;
        }
    }
}

The video parameters are width and height, fps, bitrate, and the camera id (front or back camera).

package com.aruba.rtmppushapplication.push.params;

/**
 * Audio parameters
 * Created by aruba on 2021/1/12.
 */
public class AudioParams {
    //sample rate
    private int sampleRate;
    //channel count
    private int channel;

    private AudioParams() {
    }

    public int getSampleRate() {
        return sampleRate;
    }

    public void setSampleRate(int sampleRate) {
        this.sampleRate = sampleRate;
    }

    public int getChannel() {
        return channel;
    }

    public void setChannel(int channel) {
        this.channel = channel;
    }

    public static class Builder {
        //sample rate
        private int sampleRate = 44100;
        //channel count
        private int channel = 1;

        public Builder sampleRate(int sampleRate) {
            this.sampleRate = sampleRate;
            return this;
        }

        public Builder channel(int channel) {
            this.channel = channel;
            return this;
        }
        
        public AudioParams build(){
            AudioParams audioParams = new AudioParams();
            audioParams.setSampleRate(sampleRate);
            audioParams.setChannel(channel);
            
            return audioParams;
        }
    }
}

The audio parameters are the sample rate and channel count; the sample size is always 16-bit.

4. Define a common interface for the audio and video push implementations:
package com.aruba.rtmppushapplication.push;

/**
 * Created by aruba on 2020/12/30.
 */
public interface IPush {
    /**
     * Initialize
     */
    void init();

    /**
     * Start pushing
     */
    int startPush();

    /**
     * Stop pushing
     */
    void stopPush();
}

5. Define a helper class that manages audio and video pushing:
package com.aruba.rtmppushapplication.push;

import android.app.Activity;
import android.hardware.Camera;
import android.view.SurfaceHolder;

import com.aruba.rtmppushapplication.push.natives.LiveStateChangeListener;
import com.aruba.rtmppushapplication.push.natives.NativePush;
import com.aruba.rtmppushapplication.push.params.AudioParams;
import com.aruba.rtmppushapplication.push.params.VideoParams;

import java.lang.ref.WeakReference;

/**
 * Live push helper
 * Created by aruba on 2021/1/12.
 */
public class PushHelper {
    //surface that displays the camera preview
    private SurfaceHolder surfaceHolder;
    //audio push
    private AudioPush audioPush;
    //video push
    private VideoPush videoPush;
    private WeakReference<Activity> activity;
    //native-layer wrapper
    private NativePush nativePush;

    public PushHelper(Activity activity, SurfaceHolder surfaceHolder) {
        this.activity = new WeakReference<>(activity);
        this.surfaceHolder = surfaceHolder;
        init();
    }

    /**
     * Initialize
     */
    private void init() {
        nativePush = new NativePush();
        //set the callback
        nativePush.setLiveStateChangeListener(new LiveStateChangeListener() {
            @Override
            public void onErrorPusher(int code) {
                videoPush.stopPush();
                audioPush.stopPush();
            }

            @Override
            public void onStartPusher() {
                //wait until the RTMP connection is up before pushing video and audio
                videoPush.startPush();
                audioPush.startPush();
            }

            @Override
            public void onStopPusher() {
                videoPush.stopPush();
                audioPush.stopPush();
            }
        });
        
        //initialize video parameters
        VideoParams videoParams = new VideoParams.Builder()
                .videoSize(1920, 1080)
                .bitrate(960000)
                .cameraId(Camera.CameraInfo.CAMERA_FACING_BACK)
                .build();
        videoPush = new VideoPush(activity.get(), videoParams, surfaceHolder);
        videoPush.setNativePush(nativePush);
        
        //initialize audio parameters
        AudioParams audioParams = new AudioParams.Builder()
                .channel(1)
                .sampleRate(44100)
                .build();
        audioPush = new AudioPush(audioParams);
        audioPush.setNativePush(nativePush);

        videoPush.init();
        audioPush.init();
    }

    /**
     * Start pushing
     *
     * @param url server address
     */
    public void startPush(String url) {
        nativePush.startPush(url);
    }

    /**
     * Stop pushing
     */
    public void stopPush() {
        nativePush.stopPush();
    }

    /**
     * Switch camera
     */
    public void swtichCamera() {
        if (videoPush != null)
            videoPush.swtichCamera();
    }
}
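
For completeness, here is a minimal sketch of the hosting Activity, which the original doesn't show. The RTMP URL below is a placeholder, and runtime permission handling for CAMERA and RECORD_AUDIO is assumed to happen elsewhere:

package com.aruba.rtmppushapplication;

import android.os.Bundle;
import android.view.SurfaceView;
import android.view.View;

import androidx.appcompat.app.AppCompatActivity;

import com.aruba.rtmppushapplication.push.PushHelper;

public class MainActivity extends AppCompatActivity {
    //hypothetical push address; replace with your own server's URL
    private static final String PUSH_URL = "rtmp://192.168.1.100/myapp/mystream";
    private PushHelper pushHelper;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        SurfaceView preview = findViewById(R.id.sv_preview);
        pushHelper = new PushHelper(this, preview.getHolder());
    }

    //wired to the three buttons via android:onClick
    public void startPush(View view) {
        pushHelper.startPush(PUSH_URL);
    }

    public void stopPush(View view) {
        pushHelper.stopPush();
    }

    public void swtichCamera(View view) {
        pushHelper.swtichCamera();
    }
}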

So far the basic framework is in place. Next we capture the camera and microphone data and pass it down to the native layer.
1. Capture the camera data and pass it to the native layer:
package com.aruba.rtmppushapplication.push;

import android.app.Activity;
import android.graphics.ImageFormat;
import android.hardware.Camera;
import android.util.Log;
import android.view.Surface;
import android.view.SurfaceHolder;

import com.aruba.rtmppushapplication.push.natives.NativePush;
import com.aruba.rtmppushapplication.push.params.VideoParams;

import java.io.IOException;
import java.lang.ref.WeakReference;
import java.util.Iterator;
import java.util.List;

/**
 * Java-side video push, feeding frames to the native layer
 * Created by aruba on 2021/1/12.
 */
public class VideoPush implements IPush, Camera.PreviewCallback {
    private final static String TAG = VideoPush.class.getSimpleName();
    private VideoParams videoParams;
    //camera
    private Camera camera;
    //displays the camera preview
    private SurfaceHolder surfaceHolder;
    //preview frame buffer
    private byte[] buffers;
    private boolean isSurfaceCreate;
    private NativePush nativePush;
    private WeakReference<Activity> mActivity;
    private int screen;

    private byte[] raw;
    private final static int SCREEN_PORTRAIT = 0;
    private final static int SCREEN_LANDSCAPE_LEFT = 90;
    private final static int SCREEN_LANDSCAPE_RIGHT = 270;
    private boolean isPushing;

    public VideoPush(Activity activity, VideoParams videoParams, SurfaceHolder surfaceHolder) {
        this.mActivity = new WeakReference<>(activity);
        this.videoParams = videoParams;
        this.surfaceHolder = surfaceHolder;
    }

    @Override
    public void init() {
        if (videoParams == null) {
            throw new NullPointerException("videoParams is null");
        }
        surfaceHolder.addCallback(new SurfaceHolder.Callback() {
            @Override
            public void surfaceCreated(SurfaceHolder surfaceHolder) {
                isSurfaceCreate = true;
                resetPreview(surfaceHolder);
            }

            @Override
            public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i1, int i2) {
//                stopPreview();
//                startPreview();
            }

            @Override
            public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
                isSurfaceCreate = false;
            }
        });
    }

    /**
     * Start the preview
     */
    private synchronized void startPreview() {
        try {
            camera = Camera.open(videoParams.getCameraId());
            Camera.Parameters parameters = camera.getParameters();
            parameters.setPreviewFormat(ImageFormat.NV21);//YUV (NV21)
            setPreviewSize(parameters);
            setPreviewOrientation(parameters);
//            parameters.setPreviewSize(videoParams.getVideoWidth(), videoParams.getVideoHeight());
            camera.setParameters(parameters);
            if (isSurfaceCreate)
                camera.setPreviewDisplay(surfaceHolder);

            //allocate the buffers: width * height * bits-per-pixel / 8
            //(getBitsPerPixel returns bits, not bytes; NV21 is 12 bpp, so a frame is width * height * 3 / 2 bytes)
            int bitsPerPixel = ImageFormat.getBitsPerPixel(ImageFormat.NV21);
            buffers = new byte[videoParams.getVideoWidth() * videoParams.getVideoHeight()
                    * bitsPerPixel / 8];
            raw = new byte[videoParams.getVideoWidth() * videoParams.getVideoHeight()
                    * bitsPerPixel / 8];
            camera.addCallbackBuffer(buffers);
            camera.setPreviewCallbackWithBuffer(this);
            camera.startPreview();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onPreviewFrame(byte[] bytes, Camera camera) {
        if (isPushing) {
            switch (screen) {//rotate the pixel data according to the screen orientation
                case SCREEN_PORTRAIT://portrait
                    portraitData2Raw(buffers);
                    break;
                case SCREEN_LANDSCAPE_LEFT://landscape, top of the device on the left
                    raw = buffers;
                    break;
                case SCREEN_LANDSCAPE_RIGHT://landscape, top of the device on the right
                    landscapeData2Raw(buffers);
                    break;
            }

            if (camera != null) {
                //must re-add the buffer every frame, otherwise onPreviewFrame is only invoked once; note: bytes is the same array as buffers
                camera.addCallbackBuffer(bytes);
            }

            nativePush.pushVideo(raw);
        } else {
            stopPreview();
        }
//        Log.i(TAG, "got video frame");
    }

    private synchronized void resetPreview(SurfaceHolder surfaceHolder) {
        if (camera != null) {
            try {
                camera.setPreviewDisplay(surfaceHolder);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    private synchronized void stopPreview() {
        if (camera != null) {
            camera.stopPreview();
            camera.release();
            camera = null;
        }
    }

    @Override
    public int startPush() {
        synchronized (TAG) {
            if (isPushing) {
                return -1;
            }
            isPushing = true;
        }
        startPreview();
        return 0;
    }

    @Override
    public void stopPush() {
        synchronized (TAG) {
            isPushing = false;
        }
    }

    public void swtichCamera() {
        if (videoParams.getCameraId() == Camera.CameraInfo.CAMERA_FACING_BACK) {
            videoParams.setCameraId(Camera.CameraInfo.CAMERA_FACING_FRONT);
        } else {
            videoParams.setCameraId(Camera.CameraInfo.CAMERA_FACING_BACK);
        }

        stopPreview();
        startPreview();
    }

    public void setVideoParams(VideoParams videoParams) {
        this.videoParams = videoParams;
    }

    public VideoParams getVideoParams() {
        return videoParams;
    }

    public void setNativePush(NativePush nativePush) {
        this.nativePush = nativePush;
    }

    /**
     * Query the preview sizes the camera supports and pick the closest match
     *
     * @param parameters
     */
    private void setPreviewSize(Camera.Parameters parameters) {
        List<Integer> supportedPreviewFormats = parameters.getSupportedPreviewFormats();
        for (Integer integer : supportedPreviewFormats) {
            System.out.println("supported format: " + integer);
        }
        List<Camera.Size> supportedPreviewSizes = parameters.getSupportedPreviewSizes();
        Camera.Size size = supportedPreviewSizes.get(0);
        Log.d(TAG, "支持 " + size.width + "x" + size.height);
        int m = Math.abs(size.height * size.width - videoParams.getVideoHeight() * videoParams.getVideoWidth());
        supportedPreviewSizes.remove(0);
        Iterator<Camera.Size> iterator = supportedPreviewSizes.iterator();
        while (iterator.hasNext()) {
            Camera.Size next = iterator.next();
            Log.d(TAG, "支持 " + next.width + "x" + next.height);
            int n = Math.abs(next.height * next.width - videoParams.getVideoHeight() * videoParams.getVideoWidth());
            if (n < m) {
                m = n;
                size = next;
            }
        }
        videoParams.setVideoHeight(size.height);
        videoParams.setVideoWidth(size.width);
        parameters.setPreviewSize(videoParams.getVideoWidth(), videoParams.getVideoHeight());
        Log.d(TAG, "預(yù)覽分辨率 width:" + size.width + " height:" + size.height);
    }

    /**
     * Set the camera preview orientation based on the screen rotation
     *
     * @param parameters
     */
    private void setPreviewOrientation(Camera.Parameters parameters) {
        if (mActivity.get() == null) return;

        Camera.CameraInfo info = new Camera.CameraInfo();
        Camera.getCameraInfo(videoParams.getCameraId(), info);
        int rotation = mActivity.get().getWindowManager().getDefaultDisplay().getRotation();
        screen = 0;
        switch (rotation) {
            case Surface.ROTATION_0:
                screen = SCREEN_PORTRAIT;
                nativePush.setVideoParams(videoParams.getVideoHeight(), videoParams.getVideoWidth(), videoParams.getBitrate(), videoParams.getFps());
                break;
            case Surface.ROTATION_90: // landscape, top of the device on the left (home key on the right)
                screen = SCREEN_LANDSCAPE_LEFT;
                nativePush.setVideoParams(videoParams.getVideoWidth(), videoParams.getVideoHeight(), videoParams.getBitrate(), videoParams.getFps());
                break;
            case Surface.ROTATION_180:
                screen = 180;
                break;
            case Surface.ROTATION_270:// landscape, top of the device on the right
                screen = SCREEN_LANDSCAPE_RIGHT;
                nativePush.setVideoParams(videoParams.getVideoWidth(), videoParams.getVideoHeight(), videoParams.getBitrate(), videoParams.getFps());
                break;
        }
        int result;
        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            result = (info.orientation + screen) % 360;
            result = (360 - result) % 360; // compensate the mirror
        } else { // back-facing
            result = (info.orientation - screen + 360) % 360;
        }
        camera.setDisplayOrientation(result);
    }

    private void landscapeData2Raw(byte[] data) {
        int width = videoParams.getVideoWidth();
        int height = videoParams.getVideoHeight();
        int y_len = width * height;
        int k = 0;
        //insert the Y data into raw in reverse order
        for (int i = y_len - 1; i > -1; i--) {
            raw[k] = data[i];
            k++;
        }
        // v1 u1 v2 u2
        // v3 u3 v4 u4
        // needs to become:
        // v4 u4 v3 u3
        // v2 u2 v1 u1
        int maxpos = data.length - 1;
        int uv_len = y_len >> 2; //number of VU pairs: one per 4 Y samples (4:2:0)
        for (int i = 0; i < uv_len; i++) {
            int pos = i << 1;
            raw[y_len + i * 2] = data[maxpos - pos - 1];
            raw[y_len + i * 2 + 1] = data[maxpos - pos];
        }
    }

    private void portraitData2Raw(byte[] data) {
        // if (mContext.getResources().getConfiguration().orientation !=
        // Configuration.ORIENTATION_PORTRAIT) {
        // raw = data;
        // return;
        // }
        int width = videoParams.getVideoWidth(), height = videoParams.getVideoHeight();
        int y_len = width * height;
        int uvHeight = height >> 1; //the VU plane is half the height of the Y plane
        int k = 0;
        if (videoParams.getCameraId() == Camera.CameraInfo.CAMERA_FACING_BACK) {
            for (int j = 0; j < width; j++) {
                for (int i = height - 1; i >= 0; i--) {
                    raw[k++] = data[width * i + j];
                }
            }
            for (int j = 0; j < width; j += 2) {
                for (int i = uvHeight - 1; i >= 0; i--) {
                    raw[k++] = data[y_len + width * i + j];
                    raw[k++] = data[y_len + width * i + j + 1];
                }
            }
        } else {
            for (int i = 0; i < width; i++) {
                int nPos = width - 1;
                for (int j = 0; j < height; j++) {
                    raw[k] = data[nPos - i];
                    k++;
                    nPos += width;
                }
            }
            for (int i = 0; i < width; i += 2) {
                int nPos = y_len + width - 1;
                for (int j = 0; j < uvHeight; j++) {
                    raw[k] = data[nPos - i - 1];
                    raw[k + 1] = data[nPos - i];
                    k += 2;
                    nPos += width;
                }
            }
        }
    }
}

Once the RTMP connection is established, the native layer calls back onPostNativeState; through PushHelper this ends up invoking VideoPush's startPush method, which starts the camera preview and passes the parameters to the native layer. From then on, onPreviewFrame is called continuously, handing each camera frame down to native code.

One caveat: because of how Android camera sensors are mounted, we have to rotate the camera data ourselves according to the screen orientation; the layout sketch below shows what the rotation helpers operate on.
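For reference (this sketch is not from the original post), the NV21 memory layout that portraitData2Raw and landscapeData2Raw manipulate looks like this:

// NV21 frame of width*height pixels, as delivered by onPreviewFrame:
//   [ Y plane: width*height bytes ][ interleaved VU plane: width*height/2 bytes ]
// e.g. a 4x2 frame: Y0 Y1 Y2 Y3 Y4 Y5 Y6 Y7 | V0 U0 V1 U1
// The rotation helpers move the Y bytes and the VU pairs separately, keeping each VU pair together.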
2. Capture the audio data and pass it to the native layer:
package com.aruba.rtmppushapplication.push;

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

import com.aruba.rtmppushapplication.push.natives.NativePush;
import com.aruba.rtmppushapplication.push.params.AudioParams;

/**
 * Java-side audio push, feeding PCM data to the native layer
 * Created by aruba on 2021/1/12.
 */
public class AudioPush implements IPush {
    private final static String tag = AudioPush.class.getSimpleName();
    private AudioParams audioParams;
    //audio recorder
    private AudioRecord audioRecord;
    private int bufferSize;
    private RecordThread recordThread;
    private NativePush nativePush;

    public AudioPush(AudioParams audioParams) {
        this.audioParams = audioParams;
    }

    @Override
    public void init() {
        if (audioParams == null) {
            throw new NullPointerException("audioParams is null");
        }

        int channel = audioParams.getChannel() == 1 ?
                AudioFormat.CHANNEL_IN_MONO : AudioFormat.CHANNEL_IN_STEREO;
        //minimum buffer size
        bufferSize = AudioRecord.getMinBufferSize(audioParams.getSampleRate(),
                channel, AudioFormat.ENCODING_PCM_16BIT);
        audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,//microphone
                audioParams.getSampleRate(),
                channel,
                AudioFormat.ENCODING_PCM_16BIT,
                bufferSize
        );
    }

    @Override
    public int startPush() {
        if (recordThread != null && recordThread.isPushing) {
            return -1;
        }
        stopRecord();
        recordThread = new RecordThread();
        recordThread.start();

        return 0;
    }

    @Override
    public void stopPush() {
        stopRecord();
    }

    private synchronized void stopRecord() {
        if (recordThread != null) {
            recordThread.isPushing = false;
        }
    }

    public void setAudioParams(AudioParams audioParams) {
        this.audioParams = audioParams;
    }

    public AudioParams getAudioParams() {
        return audioParams;
    }

    public void setNativePush(NativePush nativePush) {
        this.nativePush = nativePush;
    }

    class RecordThread extends Thread {
        private boolean isPushing = true;

        @Override
        public void run() {
            audioRecord.startRecording();

            nativePush.setAudioParams(audioParams.getSampleRate(), audioRecord.getChannelCount());
            while (isPushing) {
                //number of samples * 2 bytes (16-bit per sample)
                byte[] buffer = new byte[nativePush.getInputSamples() * 2];
                int len = audioRecord.read(buffer, 0, buffer.length);
                if (len > 0) {
                    //hand off to the native layer
//                    Log.i(tag, "got audio data");
                    nativePush.pushAudio(buffer, len);
                }
            }

            audioRecord.stop();
        }
    }
}

初始化AudioRecord后,需要開啟一個線程,不斷讀取數(shù)據(jù),并傳入native層

注意:一次可以讀取的數(shù)據(jù)大小需要通過faac編譯器獲取,并不能直接使用初始化AudioRecord時的bufferSize
The Java-layer code is now complete. Next comes the main event: writing the native layer.
1. On the Java side, the first step is calling the native method that starts the push thread:
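These native snippets assume a preamble along the following lines; it isn't shown in the original, and the exact include paths depend on how rtmpdump, x264 and faac were integrated in the previous article:

#include <jni.h>
#include <android/log.h>
#include <pthread.h>
#include <queue>
#include <cstring>
#include <cstdlib>
#include "rtmp.h"
#include "x264.h"
#include "faac.h"

#define LOGD(...) __android_log_print(ANDROID_LOG_DEBUG, "NativePush", __VA_ARGS__)
#define LOGE(...) __android_log_print(ANDROID_LOG_ERROR, "NativePush", __VA_ARGS__)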
pthread_t *pid;
pthread_mutex_t mutex;
pthread_cond_t cond;

//time when pushing started
uint32_t start_time;
//push URL
char *path;

//for calling back into Java
JavaVM *jvm;
jobject jPublisherObj;


JNIEXPORT jint JNICALL JNI_OnLoad(JavaVM *vm, void *reserved) {
    jvm = vm;
    JNIEnv *env = NULL;
    jint result = -1;
    if (jvm) {
        LOGD("jvm init success");
    }
    if (vm->GetEnv((void **) &env, JNI_VERSION_1_4) != JNI_OK) {
        return result;
    }
    return JNI_VERSION_1_4;
}

/**
 * Invoke a Java callback method
 * @param env
 * @param methodId
 * @param code
 */
void throwNativeInfo(JNIEnv *env, jmethodID methodId, int code) {
    if (env && methodId && jPublisherObj) {
        env->CallVoidMethodA(jPublisherObj, methodId, (jvalue *) &code);
    }
}

/**
 * Start the push thread
 */
extern "C"
JNIEXPORT void JNICALL
Java_com_aruba_rtmppushapplication_push_natives_NativePush_startPush(JNIEnv *env, jobject instance,
                                                                     jstring url_) {
    if (isPublishing)//the thread is already running
        return;

    if (!jPublisherObj) {
        jPublisherObj = env->NewGlobalRef(instance);
    }

    LOGE("開始推流");
    pthread_t id;
    pthread_t *pid = &id;
    const char *url = env->GetStringUTFChars(url_, 0);

    //copy the url string
    int url_len = strlen(url) + 1;
    path = (char *) (malloc(url_len));
    memset(path, 0, url_len);
    memcpy(path, url, url_len - 1);

    pthread_cond_init(&cond, NULL);
    pthread_mutex_init(&mutex, NULL);
    start_time = RTMP_GetTime();
    pthread_create(pid, NULL, startPush, NULL);

    env->ReleaseStringUTFChars(url_, url);
}
2. Write the thread body, which opens the RTMP connection:
bool isPublishing = false;

/**
 * Push thread
 * @param arg
 * @return
 */
void *startPush(void *arg) {
    pthread_mutex_lock(&mutex);
    isPublishing = true;
    pthread_mutex_unlock(&mutex);

    JNIEnv *env;
    jvm->AttachCurrentThread(&env, 0);
    jclass clazz = env->GetObjectClass(jPublisherObj);
    jmethodID errorId = env->GetMethodID(clazz, "onPostNativeError", "(I)V");
    jmethodID stateId = env->GetMethodID(clazz, "onPostNativeState", "(I)V");

    //RTMP connection
    RTMP *connect = RTMP_Alloc();
    RTMP_Init(connect);
    connect->Link.timeout = 5;//timeout in seconds
    RTMP_SetupURL(connect, path);//set the URL
    RTMP_EnableWrite(connect);
    if (!RTMP_Connect(connect, NULL)) {//establish the socket connection
        //connection failed
        LOGE("failed to establish rtmp connection");
        //notify the Java layer
        throwNativeInfo(env, errorId, -99);
        pthread_mutex_lock(&mutex);

        isPublishing = false;
        RTMP_Close(connect);
        RTMP_Free(connect);
        free(path);
        path = NULL;

        pthread_mutex_unlock(&mutex);
        release(env);

        jvm->DetachCurrentThread();
        pthread_exit(0);
    }
    RTMP_ConnectStream(connect, 0);//connect the stream
    LOGE("push connection established");
    throwNativeInfo(env, stateId, 100);

    while (isPublishing) {
        RTMPPacket *packet = get();
        if (packet == NULL) {
            continue;
        }

        //send
        packet->m_nInfoField2 = connect->m_stream_id;
        int ret = RTMP_SendPacket(connect, packet, 1);//1: use rtmp's internal send queue
        if (!ret) {
            LOGE("rtmp disconnected");
            throwNativeInfo(env, errorId, -100);
        }

        RTMPPacket_Free(packet);
        free(packet);
    }

    LOGE("結(jié)束推流");
    //釋放
    RTMP_Close(connect);
    RTMP_Free(connect);
    free(path);
    path = NULL;
    throwNativeInfo(env, stateId, 101);
    release(env);
    jvm->DetachCurrentThread();
    pthread_exit(0);
}
3. Build the producer-consumer queue used for thread synchronization: the push thread takes packets off the queue and sends them. An RTMPPacket is the packaged form of the encoded data. After x264/faac compress the raw audio and video, the output still has to be wrapped into something RTMP can recognize, which is really just a packing step; how to wrap the x264/faac output into RTMPPackets is described in detail further below.
//RTMPPacket queue
std::queue<RTMPPacket *> queue;

//producer
void put(RTMPPacket *pPacket) {
    pthread_mutex_lock(&mutex);
    if (isPublishing) {
        queue.push(pPacket);
    }

    pthread_cond_signal(&cond);
    pthread_mutex_unlock(&mutex);
}

//consumer
RTMPPacket *get() {
    pthread_mutex_lock(&mutex);
    if (queue.empty()) {
        pthread_cond_wait(&cond, &mutex);
    }

    RTMPPacket *packet = NULL;
    if (!queue.empty()) {
        packet = queue.front();
        queue.pop();
    }

    pthread_mutex_unlock(&mutex);

    return packet;
}
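The native stopPush implementation isn't listed in the article; a plausible sketch, assuming it only needs to clear the flag and wake the consumer so the push thread can leave its loop:

extern "C"
JNIEXPORT void JNICALL
Java_com_aruba_rtmppushapplication_push_natives_NativePush_stopPush(JNIEnv *env, jobject instance) {
    pthread_mutex_lock(&mutex);
    //the push loop checks this flag on every iteration
    isPublishing = false;
    //wake get() in case it is blocked on an empty queue
    pthread_cond_signal(&cond);
    pthread_mutex_unlock(&mutex);
}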
4. Set the audio/video parameters and initialize the buffers:
//bytes occupied by the Y, U and V planes
int y_len, u_len, v_len;
//raw input picture
x264_picture_t *pic;
//encoded output picture
x264_picture_t *pic_out;
//encoder
x264_t *encoder;

extern "C"
JNIEXPORT void JNICALL
Java_com_aruba_rtmppushapplication_push_natives_NativePush_setVideoParams(JNIEnv *env,
                                                                          jobject instance,
                                                                          jint width, jint height,
                                                                          jint bitrate, jint fps) {
    if (pic != NULL) {
        x264_picture_clean(pic);
        free(pic);
        free(pic_out);
        pic = NULL;
        pic_out = NULL;
    }
    y_len = width * height;
    u_len = y_len / 4;
    v_len = u_len;

    //configure parameters
    x264_param_t param;
    //    the zerolatency tune presets the following:
    //            param->rc.i_lookahead = 0;
    //            param->i_sync_lookahead = 0;
    //            param->i_bframe = 0;
    //            param->b_sliced_threads = 1;
    //            param->b_vfr_input = 0;
    //            param->rc.b_mb_tree = 0;
    x264_param_default_preset(&param, x264_preset_names[0], "zerolatency");
    //H.264 level (5.1 is the default)
    param.i_level_idc = 51;
    //input color space
    param.i_csp = X264_CSP_I420;
    //video width/height
    param.i_width = width;
    param.i_height = height;
    param.i_threads = 1;

    //frames per second
    param.i_timebase_num = fps;
    param.i_timebase_den = 1;
    param.i_fps_num = fps;
    param.i_fps_den = 1;
    //maximum keyframe interval, in frames
    param.i_keyint_max = fps * 2;

    //ABR: average bitrate  CQP: constant quantizer  CRF: constant rate factor (constant quality)
    param.rc.i_rc_method = X264_RC_ABR;
    //bitrate (kbps)
    param.rc.i_bitrate = bitrate / 1000;
    //maximum bitrate
    param.rc.i_vbv_max_bitrate = bitrate / 1000 * 1.2;
    //VBV buffer size
    param.rc.i_vbv_buffer_size = bitrate / 1000;

    //0: clients sync using pts themselves; 1: the pusher computes timestamps from the timebase
    param.b_vfr_input = 0;
    //emit SPS/PPS repeatedly (before each keyframe)
    param.b_repeat_headers = 1;
    //profile: baseline provides only I and P frames, lowering latency with good compatibility
    x264_param_apply_profile(&param, "baseline");

    //open the encoder
    encoder = x264_encoder_open(&param);
    if (!encoder) {
        LOGE("打開視頻編碼器失敗");
        jmethodID errorId = env->GetMethodID(env->GetObjectClass(instance), "onPostNativeError",
                                             "(I)V");
        throwNativeInfo(env, errorId, -98);
        return;
    }

    pic = (x264_picture_t *) (malloc(sizeof(x264_picture_t)));
    //use the built-in helper to initialize pic, which holds the YUV420 data
    x264_picture_alloc(pic, X264_CSP_I420, width, height);
    pic_out = (x264_picture_t *) (malloc(sizeof(x264_picture_t)));
    LOGE("視頻編碼器打開完成");
}
//audio encoder
faacEncHandle handle;
//number of input samples faacEncEncode expects per call
unsigned long inputSamples;
//maximum number of encoded output bytes
unsigned long maxOutputBytes;

extern "C"
JNIEXPORT void JNICALL
Java_com_aruba_rtmppushapplication_push_natives_NativePush_setAudioParams(JNIEnv *env,
                                                                          jobject instance,
                                                                          jint sample,
                                                                          jint channel) {
    handle = faacEncOpen(sample, channel, &inputSamples, &maxOutputBytes);
    if (!handle) {
        LOGE("音頻編碼器打開失敗");
        jmethodID errorId = env->GetMethodID(env->GetObjectClass(instance), "onPostNativeError",
                                             "(I)V");
        throwNativeInfo(env, errorId, -97);
        return;
    }

    //configuration
    faacEncConfigurationPtr config = faacEncGetCurrentConfiguration(handle);
    config->mpegVersion = MPEG4;
    config->allowMidside = 1;//allow mid/side stereo coding
    config->aacObjectType = LOW;//AAC-LC (low complexity)
    config->outputFormat = 0;//output format: 0 = raw AAC stream (no ADTS headers)
    config->useTns = 1;//temporal noise shaping
    config->useLfe = 0;
    config->inputFormat = FAAC_INPUT_16BIT;
    config->quantqual = 100;
    config->bandWidth = 0; //bandwidth (0 = automatic)
    config->shortctl = SHORTCTL_NORMAL;//block switching mode

    int ret = faacEncSetConfiguration(handle, config);
    if (!ret) {
        LOGE("音頻編碼器設(shè)置失敗");
        jmethodID errorId = env->GetMethodID(env->GetObjectClass(instance), "onPostNativeError",
                                             "(I)V");
        throwNativeInfo(env, errorId, -96);
        return;
    }

    LOGE("音頻編碼器設(shè)置成功");
}

For the video and audio encoder settings, the comments should suffice; I'm not a codec specialist, so I won't go into more depth here. Interested readers can look up the details online.
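The getInputSamples native method isn't shown in the article either; presumably it just returns the value produced by faacEncOpen, which the Java RecordThread multiplies by 2 bytes per sample:

extern "C"
JNIEXPORT jint JNICALL
Java_com_aruba_rtmppushapplication_push_natives_NativePush_getInputSamples(JNIEnv *env, jobject instance) {
    //number of 16-bit PCM samples faacEncEncode expects per call
    return (jint) inputSamples;
}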

5. Write the encoding code that compresses the raw audio/video data passed down from the Java layer:
/**
 * Encode video
 */
extern "C"
JNIEXPORT void JNICALL
Java_com_aruba_rtmppushapplication_push_natives_NativePush_pushVideo(JNIEnv *env, jobject instance,
                                                                     jbyteArray buffer_) {
    if (!isPublishing || !encoder || !pic) {
        return;
    }
    jbyte *buffer = env->GetByteArrayElements(buffer_, NULL);

    uint8_t *u = pic->img.plane[1];
    uint8_t *v = pic->img.plane[2];
    //convert NV21 to I420 (de-interleave the VU plane, swapping V and U)
    for (int i = 0; i < u_len; i++) {
        *(u + i) = *(buffer + y_len + i * 2 + 1);
        *(v + i) = *(buffer + y_len + i * 2);
    }
    memcpy(pic->img.plane[0], buffer, y_len);
//    pic->img.plane[0] = buffer;

    //NAL units
    x264_nal_t *nal = 0;
    //number of NAL units
    int pi_nal;
    int ret = x264_encoder_encode(encoder, &nal, &pi_nal, pic, pic_out);
    if (ret < 0) {
        env->ReleaseByteArrayElements(buffer_, buffer, 0);
        LOGE("編碼失敗");
        return;
    }

    //unpack: hand each NAL unit's payload to the RTMP packaging code
    unsigned char sps[100];
    unsigned char pps[100];
    int sps_len = 0;
    int pps_len = 0;

    for (int i = 0; i < pi_nal; i++) {
        if (nal[i].i_type == NAL_SPS) {//sequence parameter set
            //strip the start code (4 bytes)
            sps_len = nal[i].i_payload - 4;
            //copy the payload
            memcpy(sps, nal[i].p_payload + 4, sps_len);
        } else if (nal[i].i_type == NAL_PPS) {//picture parameter set
            pps_len = nal[i].i_payload - 4;
            memcpy(pps, nal[i].p_payload + 4, pps_len);

            //once both SPS and PPS are in hand, send the header info
            send_264_header(sps, pps, sps_len, pps_len);
        } else {//send keyframes and non-keyframes
            send_264_body(nal[i].p_payload, nal[i].i_payload);
        }
    }

    env->ReleaseByteArrayElements(buffer_, buffer, 0);
}

/**
 * Encode audio
 */
extern "C"
JNIEXPORT void JNICALL
Java_com_aruba_rtmppushapplication_push_natives_NativePush_pushAudio(JNIEnv *env, jobject instance,
                                                                     jbyteArray buffer_,
                                                                     jint size) {

    if (!isPublishing || !handle)
        return;

    jbyte *buffer = env->GetByteArrayElements(buffer_, NULL);

    unsigned char *outputBuffer = (unsigned char *) (malloc(
            sizeof(unsigned char) * maxOutputBytes));
    //encode
    int len = faacEncEncode(handle, (int32_t *) buffer, inputSamples, outputBuffer,
                            maxOutputBytes);
    if (len > 0) {
//        LOGE("rtmp audio push");
        send_aac_body(outputBuffer, len);
    }

    env->ReleaseByteArrayElements(buffer_, buffer, 0);

    if (outputBuffer)
        free(outputBuffer);
}

The harder part to understand is the video side. As introduced before, H.264 consists mainly of I, B and P frames, which carry the (compressed) pixel data. Because the data is compressed, information about the original frames, such as the width and height and the compression parameters, has to be stored as well; that is what the SPS and PPS carry, something like the headers of an HTTP request. Players need this information too (a decoder at least has to know the frame dimensions). In the x264 output, the SPS, PPS and every other NAL unit are preceded by a start code of 4 (or 3) bytes acting as a separator; RTMP has no use for those bytes, so we strip them off.
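As an illustration (not from the original post), the Annex-B stream that x264 emits looks like this; send_264_body below distinguishes the two start-code lengths by checking the third byte:

00 00 00 01 67 ...   SPS, NAL type 7: payload goes into sps[] minus the 4-byte start code
00 00 00 01 68 ...   PPS, NAL type 8: payload goes into pps[]
00 00 00 01 65 ...   IDR slice, NAL type 5: packaged with first body byte 0x17 (keyframe)
00 00 01 41 ...      non-IDR slice, NAL type 1, 3-byte start code: first body byte 0x27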

Next we wrap the encoded audio and video data into RTMPPackets.
Video first. The packaging follows the FLV video tag format (the original post embeds a reference table from the FLV spec at this point).
1. First the SPS and PPS; the first body byte is 0x17. The packaging code:
/**
 * Send the RTMP header info (SPS/PPS)
 * @param sps
 * @param pps
 * @param sps_len
 * @param pps_len
 */
void send_264_header(unsigned char *sps, unsigned char *pps, int sps_len, int pps_len) {
    int size = sps_len + pps_len + 16;//the RTMP header info needs an extra 16 bytes
    RTMPPacket *packet = static_cast<RTMPPacket *>(malloc(sizeof(RTMPPacket)));
    //initialize the internal buffer
    RTMPPacket_Alloc(packet, size);

    //build the body
    unsigned char *body = reinterpret_cast<unsigned char *>(packet->m_body);
    int i = 0;
    body[i++] = 0x17;//frame type: keyframe (1) + codec id: AVC (7)
    body[i++] = 0x00;//AVCPacketType: 0 = sequence header
    //composition time, 3 bytes of 0
    body[i++] = 0x00;
    body[i++] = 0x00;
    body[i++] = 0x00;
    //configurationVersion
    body[i++] = 0x01;
    //profile
    body[i++] = sps[1];
    //profile compatibility
    body[i++] = sps[2];
    //level
    body[i++] = sps[3];
    body[i++] = 0xff;//NAL unit length field size: 4 bytes
    body[i++] = 0xe1;//number of SPS: 1
    //SPS length
    body[i++] = (sps_len >> 8) & 0xff;
    body[i++] = sps_len & 0xff;
    //SPS payload
    memcpy(&body[i], sps, sps_len);
    i += sps_len;//advance the offset

    //PPS
    body[i++] = 0x01;//number of PPS: 1
    //PPS length
    body[i++] = (pps_len >> 8) & 0xff;
    body[i++] = pps_len & 0xff;
    memcpy(&body[i], pps, pps_len);

    //packet parameters
    packet->m_packetType = RTMP_PACKET_TYPE_VIDEO;//video packet
    packet->m_nBodySize = size;
    //clients sync on pts themselves
    packet->m_nTimeStamp = 0;
    packet->m_hasAbsTimestamp = 0;
    //channel
    packet->m_nChannel = 4;
    packet->m_headerType = RTMP_PACKET_SIZE_MEDIUM;

    //enqueue
    put(packet);
}
2. Then the keyframes and non-keyframes:
/**
 * Send keyframes and non-keyframes over RTMP
 * @param payload
 * @param i_payload
 */
void send_264_body(uint8_t *payload, int i_payload) {
    if (payload[2] == 0x00) {//third byte 0x00: the start code is 4 bytes (00 00 00 01)
        payload += 4;
        i_payload -= 4;
    } else if (payload[2] == 0x01) {//third byte 0x01: the start code is 3 bytes (00 00 01)
        payload += 3;
        i_payload -= 3;
    }

    //build the packet
    int size = i_payload + 9;//the RTMP frame data needs an extra 9 bytes
    RTMPPacket *packet = static_cast<RTMPPacket *>(malloc(sizeof(RTMPPacket)));
    //initialize the internal buffer
    RTMPPacket_Alloc(packet, size);

    char *body = packet->m_body;
    int type = payload[0] & 0x1f;
    int index = 0;
    if (type == NAL_SLICE_IDR) {//keyframe
        body[index++] = 0x17;
    } else {//non-keyframe
        body[index++] = 0x27;
    }

    body[index++] = 0x01;//AVCPacketType: 1 = NALU data
    //composition time, 3 bytes of 0
    body[index++] = 0x00;
    body[index++] = 0x00;
    body[index++] = 0x00;

    //NALU length, 4 bytes
    body[index++] = (i_payload >> 24) & 0xff;
    body[index++] = (i_payload >> 16) & 0xff;
    body[index++] = (i_payload >> 8) & 0xff;
    body[index++] = i_payload & 0xff;

    //copy the data
    memcpy(&body[index], payload, i_payload);

    //packet parameters
    packet->m_packetType = RTMP_PACKET_TYPE_VIDEO;//video packet
    packet->m_nBodySize = size;
    //clients sync on pts themselves
    packet->m_nTimeStamp = RTMP_GetTime() - start_time;//so clients know the playback position
    packet->m_hasAbsTimestamp = 0;
    //channel
    packet->m_nChannel = 0x04;
    packet->m_headerType = RTMP_PACKET_SIZE_LARGE;

    put(packet);
}
最后組包音頻數(shù)據(jù),先看下音頻數(shù)據(jù)的文檔:
音頻組包很簡單,代碼如下:
/**
 * 組包音頻packet
 * @param buffer 
 * @param len 
 */
void send_aac_body(unsigned char *buffer, int len) {
    int size = len + 2;

    RTMPPacket *packet = static_cast<RTMPPacket *>(malloc(sizeof(RTMPPacket)));
    //initialize the internal buffer
    RTMPPacket_Alloc(packet, size);

    char *body = packet->m_body;
    body[0] = 0xAF;//0xAF: AAC (10), 44 kHz (3), 16-bit (1), stereo (1), per the FLV audio tag spec
    body[1] = 0x01;//AACPacketType: 1 = raw AAC frame data
    memcpy(&body[2], buffer, len);

    //packet parameters
    packet->m_packetType = RTMP_PACKET_TYPE_AUDIO;//audio packet
    packet->m_nBodySize = size;
    //clients sync on pts themselves
    packet->m_nTimeStamp = RTMP_GetTime() - start_time;//so clients know the playback position
    packet->m_hasAbsTimestamp = 0;
    //channel
    packet->m_nChannel = 0x04;
    packet->m_headerType = RTMP_PACKET_SIZE_MEDIUM;
    put(packet);
}
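One caveat worth noting (an addition, not in the original): many players also expect an AAC sequence header, an audio packet whose second byte is 0x00 and whose payload is the AudioSpecificConfig, before the first raw frame. A hedged sketch of what sending it could look like, using faac's faacEncGetDecoderSpecificInfo:

/**
 * Hypothetical helper: send the AAC sequence header once before the first raw AAC frame
 */
void send_aac_header() {
    unsigned char *config = NULL;
    unsigned long config_len = 0;
    //faac fills in a buffer containing the AudioSpecificConfig
    faacEncGetDecoderSpecificInfo(handle, &config, &config_len);

    int size = config_len + 2;
    RTMPPacket *packet = static_cast<RTMPPacket *>(malloc(sizeof(RTMPPacket)));
    RTMPPacket_Alloc(packet, size);

    char *body = packet->m_body;
    body[0] = 0xAF;
    body[1] = 0x00;//AACPacketType: 0 = sequence header (raw frames use 0x01)
    memcpy(&body[2], config, config_len);

    packet->m_packetType = RTMP_PACKET_TYPE_AUDIO;
    packet->m_nBodySize = size;
    packet->m_nTimeStamp = 0;
    packet->m_hasAbsTimestamp = 0;
    packet->m_nChannel = 0x04;
    packet->m_headerType = RTMP_PACKET_SIZE_MEDIUM;
    put(packet);

    free(config);//faac allocates this buffer and leaves freeing to the caller
}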
終于,直播推流代碼完成了,趕緊跑下看下效果吧
項(xiàng)目地址:https://gitee.com/aruba/rtmp-push-application.git