On Android we usually use the MediaPlayer class to play audio and video. Starting from MediaPlayer, this post works down through Android's audio/video playback framework layer by layer.
MediaPlayer
MediaPlayer is easy to use. To play a video.mp4 file from assets on a SurfaceView, the few lines below are enough to get video onto the screen:
SurfaceView surfaceView = (SurfaceView) findViewById(R.id.surface);
surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        MediaPlayer player = new MediaPlayer();
        player.setDisplay(holder); // where the video should be rendered
        try {
            player.setDataSource(getAssets().openFd("video.mp4")); // set the video source
            player.prepare(); // prepare the video data
        } catch (IOException e) {
            e.printStackTrace();
        }
        player.start(); // start playback
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
    }
});
The MediaPlayer API and its usage are simple; reading through the official state diagram from Google is basically all you need to use it comfortably. In most cases it comes down to calling setDataSource, prepare, start, pause, stop, reset, and release:
I won't go over the usage details here; if you're interested, see the official documentation.
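To make the state rules concrete, here is a hypothetical sketch (not the platform code) of how a player can gate its methods with a state machine; the state names mirror the official diagram, but the checking logic below is invented purely for illustration:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical sketch of MediaPlayer-style state checking (not the real
// framework code). Each state is a bit flag; each operation declares the
// states it is legal in, mirroring the official state diagram.
enum State : uint32_t {
    IDLE        = 1 << 0,
    INITIALIZED = 1 << 1,  // after setDataSource
    PREPARED    = 1 << 2,  // after prepare/prepareAsync
    STARTED     = 1 << 3,
    PAUSED      = 1 << 4,
    STOPPED     = 1 << 5,
};

struct Player {
    State state = IDLE;

    bool setDataSource() {
        if (!(state & IDLE)) return false;  // only legal from Idle
        state = INITIALIZED;
        return true;
    }
    bool prepare() {
        if (!(state & (INITIALIZED | STOPPED))) return false;
        state = PREPARED;
        return true;
    }
    bool start() {
        if (!(state & (PREPARED | PAUSED | STARTED))) return false;
        state = STARTED;
        return true;
    }
};
```

Calling a method from the wrong state simply fails, which is exactly the behavior the state diagram documents (the real class throws IllegalStateException instead of returning a bool).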
The Android Media Framework
When we call MediaPlayer methods in an app, the calls ultimately reach MediaPlayerService through IPC. And it's not just MediaPlayer: the media playback APIs under android.media, such as AudioTrack, SoundPool, and MediaCodec, all go to MediaPlayerService for the actual encoding and decoding work. Android media playback is a typical client/server architecture; see the architecture diagram in the official documentation:
The benefit of the C/S architecture is that software and hardware codec resources can be managed centrally.
Besides the Java-level MediaPlayer, the framework involves three key .so libraries:
- libmedia_jni.so bridges the Java and native layers via JNI, then calls the interfaces provided by the native MediaPlayer class
- libmedia.so provides the native MediaPlayer class to the layer above, which handles the client side of the IPC with MediaPlayerService
- libmediaplayerservice.so orchestrates the concrete encoders and decoders; it also implements libmedia.so's IMediaPlayer interface to receive the commands the client sends over IPC
Their dependency relationship looks like this:
Code Details
Now let's trace through the actual implementation.
The android.media.MediaPlayer Java class is really just a proxy for the native layer; the real work is done through JNI calls into the C/C++ code in libmedia_jni.so:
public class MediaPlayer extends PlayerBase implements SubtitleController.Listener {
    ...
    static {
        System.loadLibrary("media_jni");
        native_init();
    }
    ...
    private static native final void native_init();
    ...
    private native void _setVideoSurface(Surface surface);
    ...
    private native void _prepare() throws IOException, IllegalStateException;
    ...
    private native void _start() throws IllegalStateException;
    ...
    public void setDataSource(FileDescriptor fd, long offset, long length)
            throws IOException, IllegalArgumentException, IllegalStateException {
        _setDataSource(fd, offset, length);
    }
    ...
    public void setDisplay(SurfaceHolder sh) {
        mSurfaceHolder = sh;
        Surface surface;
        if (sh != null) {
            surface = sh.getSurface();
        } else {
            surface = null;
        }
        _setVideoSurface(surface);
        updateSurfaceScreenOn();
    }
    ...
    public void prepare() throws IOException, IllegalStateException {
        _prepare();
        scanInternalSubtitleTracks();
    }
    ...
    public void start() throws IllegalStateException {
        baseStart();
        stayAwake(true);
        _start();
    }
    ...
}
The implementation of libmedia_jni.so can be found in /frameworks/base/media/jni/android_media_MediaPlayer.cpp:
static void
android_media_MediaPlayer_native_setup(JNIEnv *env, jobject thiz, jobject weak_this)
{
    sp<MediaPlayer> mp = new MediaPlayer();
    ...
    setMediaPlayer(env, thiz, mp);
}

static void
android_media_MediaPlayer_prepare(JNIEnv *env, jobject thiz)
{
    sp<MediaPlayer> mp = getMediaPlayer(env, thiz);
    ...
    sp<IGraphicBufferProducer> st = getVideoSurfaceTexture(env, thiz);
    mp->setVideoSurfaceTexture(st);
    process_media_player_call(env, thiz, mp->prepare(), "java/io/IOException", "Prepare failed.");
}

static void
android_media_MediaPlayer_start(JNIEnv *env, jobject thiz)
{
    sp<MediaPlayer> mp = getMediaPlayer(env, thiz);
    ...
    process_media_player_call(env, thiz, mp->start(), NULL, NULL);
}
...
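The setMediaPlayer/getMediaPlayer helpers above use the common JNI "native handle" pattern: the address of the native object is stored in a long field of the Java object so that later JNI calls can recover it. Below is a simplified, self-contained illustration of the idea, with a plain struct standing in for the real JNIEnv field access (all names here are invented, not the actual JNI code):

```cpp
#include <cassert>
#include <cstdint>

// FakeJavaObject stands in for the Java MediaPlayer's mNativeContext field.
struct FakeJavaObject { intptr_t nativeContext = 0; };

struct NativePlayer { int id; };

void setNativePlayer(FakeJavaObject& thiz, NativePlayer* p) {
    // In real JNI code this would be env->SetLongField(thiz, fieldId, ...)
    thiz.nativeContext = reinterpret_cast<intptr_t>(p);
}

NativePlayer* getNativePlayer(const FakeJavaObject& thiz) {
    // In real JNI code this would be env->GetLongField(thiz, fieldId)
    return reinterpret_cast<NativePlayer*>(thiz.nativeContext);
}
```

Every subsequent JNI entry point (prepare, start, ...) starts by recovering the native object from the Java object this way, which is why each of the functions above begins with getMediaPlayer(env, thiz).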
libmedia_jni.so in turn relies on the native MediaPlayer class to do its work. Its code lives in /frameworks/av/include/media/mediaplayer.h and /frameworks/av/media/libmedia/mediaplayer.cpp, and after compilation it is packaged into libmedia.so.
We just saw libmedia_jni.so calling MediaPlayer's methods; so how does this MediaPlayer do its work? Let's look at the code:
// mediaplayer.h
sp<IMediaPlayer> mPlayer;

// mediaplayer.cpp
status_t MediaPlayer::prepare()
{
    ...
    status_t ret = prepareAsync_l();
    ...
}

status_t MediaPlayer::prepareAsync_l()
{
    ...
    return mPlayer->prepareAsync();
    ...
}
Here it depends on an IMediaPlayer; let's keep digging to see where this IMediaPlayer comes from:
status_t MediaPlayer::attachNewPlayer(const sp<IMediaPlayer>& player)
{
    ...
    mPlayer = player;
    ...
}

status_t MediaPlayer::setDataSource(
        const sp<IMediaHTTPService> &httpService,
        const char *url, const KeyedVector<String8, String8> *headers)
{
    ALOGV("setDataSource(%s)", url);
    status_t err = BAD_VALUE;
    if (url != NULL) {
        const sp<IMediaPlayerService> service(getMediaPlayerService());
        if (service != 0) {
            sp<IMediaPlayer> player(service->create(this, mAudioSessionId));
            if ((NO_ERROR != doSetRetransmitEndpoint(player)) ||
                (NO_ERROR != player->setDataSource(httpService, url, headers))) {
                player.clear();
            }
            err = attachNewPlayer(player);
        }
    }
    return err;
}
// MediaPlayer inherits from IMediaDeathNotifier
/*static*/ const sp<IMediaPlayerService>
IMediaDeathNotifier::getMediaPlayerService()
{
    Mutex::Autolock _l(sServiceLock);
    if (sMediaPlayerService == 0) {
        sp<IServiceManager> sm = defaultServiceManager();
        sp<IBinder> binder;
        do {
            binder = sm->getService(String16("media.player"));
            if (binder != 0) {
                break;
            }
            usleep(500000); // 0.5 s
        } while (true);

        if (sDeathNotifier == NULL) {
            sDeathNotifier = new DeathNotifier();
        }
        binder->linkToDeath(sDeathNotifier);
        sMediaPlayerService = interface_cast<IMediaPlayerService>(binder);
    }
    return sMediaPlayerService;
}
As you can see, getMediaPlayerService fetches the "media.player" service from ServiceManager, obtaining a Binder proxy for IMediaPlayerService. The call then lands in MediaPlayerService::create:
sp<IMediaPlayer> MediaPlayerService::create(const sp<IMediaPlayerClient>& client,
        audio_session_t audioSessionId)
{
    pid_t pid = IPCThreadState::self()->getCallingPid();
    int32_t connId = android_atomic_inc(&mNextConnId);

    sp<Client> c = new Client(
            this, pid, connId, client, audioSessionId,
            IPCThreadState::self()->getCallingUid());

    ALOGV("Create new client(%d) from pid %d, uid %d, ", connId, pid,
            IPCThreadState::self()->getCallingUid());

    wp<Client> w = c;
    {
        Mutex::Autolock lock(mLock);
        mClients.add(w);
    }
    return c;
}
MediaPlayerService creates a Client and returns it to the caller; through this Client the client side invokes MediaPlayerService's functionality. As an aside, Client is an inner class of MediaPlayerService; it inherits from BnMediaPlayer, and BnMediaPlayer in turn inherits from BnInterface<IMediaPlayer>:
// MediaPlayerService.h
class MediaPlayerService : public BnMediaPlayerService {
    ...
    class Client : public BnMediaPlayer {
        ...
    };
    ...
};

// IMediaPlayer.h
class BnMediaPlayer: public BnInterface<IMediaPlayer>
{
public:
    virtual status_t onTransact(uint32_t code,
                                const Parcel& data,
                                Parcel* reply,
                                uint32_t flags = 0);
};
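The Bn/Bp naming reflects Binder's stub/proxy split: the Bp class on the client side turns a method call into a transaction code sent across the process boundary, and the Bn class's onTransact on the server side decodes that code back into a virtual-method call. Here is a much-simplified illustration of that dispatch with the Parcel/Binder machinery stripped out (all names below are stand-ins, not the real libbinder API):

```cpp
#include <cassert>
#include <string>

// Transaction codes, one per remote method.
enum { PREPARE_ASYNC = 1, START = 2 };

struct IMediaPlayerLike {
    virtual ~IMediaPlayerLike() = default;
    virtual int prepareAsync() = 0;
    virtual int start() = 0;
};

// Bn side: decodes a transaction code into a virtual call.
struct BnMediaPlayerLike : IMediaPlayerLike {
    int onTransact(int code) {
        switch (code) {
            case PREPARE_ASYNC: return prepareAsync();
            case START:         return start();
            default:            return -1;
        }
    }
};

// The concrete server-side implementation (MediaPlayerService::Client's role).
struct ClientLike : BnMediaPlayerLike {
    std::string log;
    int prepareAsync() override { log += "prepare;"; return 0; }
    int start() override        { log += "start;";   return 0; }
};

// Bp side: in the real framework this writes a Parcel and calls
// remote()->transact(); here it just forwards the code directly.
struct BpMediaPlayerLike {
    BnMediaPlayerLike* remote;
    int prepareAsync() { return remote->onTransact(PREPARE_ASYNC); }
    int start()        { return remote->onTransact(START); }
};
```

The native MediaPlayer in libmedia.so holds the Bp end (the mPlayer field we saw earlier), while MediaPlayerService::Client is the Bn end living in the service process.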
How MediaPlayerService Works
Looking at the MediaPlayerService source, we can see that setDataSource finds a player type that supports the given source, then creates that player for use:
status_t MediaPlayerService::Client::setDataSource(const sp<IMediaHTTPService> &httpService,
        const char *url, const KeyedVector<String8, String8> *headers)
{
    ...
    player_type playerType = MediaPlayerFactory::getPlayerType(this, url);
    sp<MediaPlayerBase> p = setDataSource_pre(playerType);
    ...
    setDataSource_post(p, p->setDataSource(httpService, url, headers));
    ...
}

sp<MediaPlayerBase> MediaPlayerService::Client::setDataSource_pre(player_type playerType)
{
    ...
    sp<MediaPlayerBase> p = createPlayer(playerType);
    ...
}

sp<MediaPlayerBase> MediaPlayerService::Client::createPlayer(player_type playerType)
{
    sp<MediaPlayerBase> p = mPlayer;
    ...
    p = MediaPlayerFactory::createPlayer(playerType, this, notify, mPid);
    ...
    return p;
}
As you can see, the work goes through the MediaPlayerFactory factory class. MediaPlayerFactory::registerBuiltinFactories registers the available players, and the right one is chosen based on the audio/video source. It's worth noting that on SDK 23 and earlier there are two players, StagefrightPlayer and NuPlayer; from SDK 24 onward, the only player actually doing the work is NuPlayer. Vendors can also register their own players here; for example, Xiaomi's Mi Box ROM once brought in a VLC-based player. Since my company still runs a large fleet of Android 4.4 devices, I'll cover both players.
// android sdk 23
void MediaPlayerFactory::registerBuiltinFactories() {
    Mutex::Autolock lock_(&sLock);

    if (sInitComplete)
        return;

    registerFactory_l(new StagefrightPlayerFactory(), STAGEFRIGHT_PLAYER);
    registerFactory_l(new NuPlayerFactory(), NU_PLAYER);
    registerFactory_l(new TestPlayerFactory(), TEST_PLAYER);

    sInitComplete = true;
}

// android sdk 24
void MediaPlayerFactory::registerBuiltinFactories() {
    Mutex::Autolock lock_(&sLock);

    if (sInitComplete)
        return;

    registerFactory_l(new NuPlayerFactory(), NU_PLAYER);
    registerFactory_l(new TestPlayerFactory(), TEST_PLAYER);

    sInitComplete = true;
}
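Under the hood, getPlayerType asks each registered factory to score the data source and picks the highest scorer. The sketch below captures that selection idea; the scoring rules and class names are invented for illustration, not the real MediaPlayerFactory logic:

```cpp
#include <cassert>
#include <map>
#include <string>

// Invented stand-ins for the real player_type values.
enum player_type { STAGEFRIGHT_PLAYER, NU_PLAYER };

struct FactoryLike {
    virtual ~FactoryLike() = default;
    virtual float scoreFactory(const std::string& url) = 0;
};

struct StagefrightFactoryLike : FactoryLike {
    float scoreFactory(const std::string& url) override {
        // pretend rule: confident about local files
        return url.rfind("file://", 0) == 0 ? 0.9f : 0.1f;
    }
};

struct NuPlayerFactoryLike : FactoryLike {
    float scoreFactory(const std::string& url) override {
        // pretend rule: confident about streaming protocols
        return url.rfind("rtsp://", 0) == 0 ? 0.9f : 0.2f;
    }
};

// Score every registered factory and return the best-scoring player type.
player_type getPlayerType(const std::string& url) {
    StagefrightFactoryLike sf;
    NuPlayerFactoryLike np;
    std::map<player_type, FactoryLike*> factories = {
        {STAGEFRIGHT_PLAYER, &sf},
        {NU_PLAYER, &np},
    };
    player_type best = NU_PLAYER;
    float bestScore = -1.0f;
    for (auto& kv : factories) {
        float s = kv.second->scoreFactory(url);
        if (s > bestScore) { bestScore = s; best = kv.first; }
    }
    return best;
}
```

Registering a vendor player then simply means adding another factory whose score beats the built-in ones for the sources it handles.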
StagefrightPlayer actually refers to AwesomePlayer. Early Android used AwesomePlayer to play local video and NuPlayer for streaming. Later, AwesomePlayer was gradually deprecated for reasons I couldn't pin down (only mentions that it had problems), and playback was unified on NuPlayer. In some transitional Android versions you could even choose NuPlayer over AwesomePlayer in the developer options; in later versions there is nothing to choose, as NuPlayer is the only one left.
I'll dig into the implementations of AwesomePlayer and NuPlayer in the next post. Let's move on to the OpenMax framework that both players depend on.
The OpenMax (OMX) Framework
Open Media Acceleration (OpenMAX) is a royalty-free, cross-platform software abstraction layer with a C-language interface for multimedia processing. It is a standard proposed and maintained by the Khronos Group, aiming to provide a unified interface that accelerates the processing of large volumes of multimedia data.
In other words, OpenMax provides the concrete software and hardware codec capabilities; by depending on it, AwesomePlayer and NuPlayer gain their encoding and decoding functionality.
OpenMax is split into three layers:
Development Layer (DL)
This layer defines basic audio, video, and image algorithms, such as the fast Fourier transform and filters for audio signal processing, color-space conversion (RGB, YUV, etc.) for image processing, and MPEG-4, H.264, MP3, AAC, and JPEG codecs.
The DL layer covers five application domains:
AC - audio codecs
IC - image codecs
IP - image processing (general image processing functions)
SP - signal processing (general audio processing functions)
VC - video codecs (H.264 and MPEG-4 components)
These are fairly algorithm-level interfaces, implemented by the chip vendors.
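As a taste of what a DL-level primitive looks like, here is a textbook BT.601-style RGB-to-YUV conversion, the kind of color-space routine this layer standardizes (this is standard math for illustration, not actual OpenMax DL API code):

```cpp
#include <cassert>
#include <cmath>

struct YUV { double y, u, v; };

// BT.601 full-range RGB -> YUV, with U and V offset to center on 128.
YUV rgbToYuv(double r, double g, double b) {
    YUV out;
    out.y =  0.299 * r + 0.587 * g + 0.114 * b;          // luma
    out.u = -0.169 * r - 0.331 * g + 0.500 * b + 128.0;  // blue-difference chroma
    out.v =  0.500 * r - 0.419 * g - 0.081 * b + 128.0;  // red-difference chroma
    return out;
}
```

For white (255, 255, 255) this yields Y ≈ 255 with both chroma channels at their neutral value 128, which is the expected result for a gray-scale input.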
Integration Layer (IL)
This layer integrates the algorithms and functionality of the DL layer into a relatively low-level codec interface; implementing this layer's interface amounts to implementing a codec, which can be either software or hardware.
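The IL component model revolves around a state machine (Loaded, Idle, Executing) and two buffer calls: the client hands compressed input to the component with EmptyThisBuffer and collects decoded output with FillThisBuffer. The mock below illustrates that flow; the method names echo the spec, but the code itself is an invented sketch, not the real OMX IL API:

```cpp
#include <cassert>
#include <queue>
#include <string>

// Simplified states from the OMX IL component lifecycle.
enum OmxState { LOADED, IDLE, EXECUTING };

struct MockDecoder {
    OmxState state = LOADED;
    std::queue<std::string> output;

    // Only the legal forward transitions are allowed in this sketch.
    bool sendCommand(OmxState target) {
        if (state == LOADED && target == IDLE)    { state = IDLE; return true; }
        if (state == IDLE && target == EXECUTING) { state = EXECUTING; return true; }
        return false;
    }
    // Client hands the component a compressed input buffer.
    bool emptyThisBuffer(const std::string& compressed) {
        if (state != EXECUTING) return false;
        output.push("decoded(" + compressed + ")"); // stand-in for real decoding
        return true;
    }
    // Client collects a decoded output buffer.
    bool fillThisBuffer(std::string* out) {
        if (output.empty()) return false;
        *out = output.front();
        output.pop();
        return true;
    }
};
```

A player like NuPlayer essentially drives this loop: move the component into Executing, keep feeding it input buffers, and keep draining the decoded output buffers for rendering.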
Application Layer (AL)
The AL layer provides a standardized API between multimedia middleware and applications. Each platform should have its own implementation; by programming against this layer's interface, an application gains good cross-platform portability.
The Full Framework Diagram
At this point, the whole audio/video playback architecture should be clear.