Preface
A recent project required implementing screen recording (screen content, microphone, and in-app audio), then compressing the result and uploading it to a server. To that end I did some initial research into the iOS ReplayKit framework; here are the results:
Overview
Given the project's fast iteration schedule, the first candidate was Apple's official ReplayKit framework. Initial research showed that ReplayKit requires only iOS 9.0 or later and supports recording the screen, the microphone, and in-app audio, which made it technically feasible, so we decided to build directly on ReplayKit.
About ReplayKit
ReplayKit was introduced with iOS 9.0 at WWDC15. Its original purpose was to let game developers record gameplay videos for social sharing. Besides recording and sharing, ReplayKit also ships a full-featured user interface that players can use to edit their video clips.
ReplayKit feature introduction video: WWDC15
Beyond screen recording, ReplayKit can also broadcast the recorded audio/video streams in real time. On iOS this requires two key capabilities: screen-content capture and media-stream broadcasting. The former needs system-granted permission so developers can capture what is on screen, at the app or system level; the latter needs the system to hand over the real-time video and audio streams, which can then be pushed to a server to broadcast the media.
Recording
iOS 9.0
// Header
#import <ReplayKit/ReplayKit.h>
// Start recording
- (void)startRecordingWithMicrophoneEnabled:(BOOL)microphoneEnabled handler:(nullable void (^)(NSError *_Nullable error))handler API_DEPRECATED("Use microphoneEnabled property", ios(9.0, 10.0)) API_UNAVAILABLE(macOS);
// Stop recording
- (void)stopRecordingWithHandler:(nullable void (^)(RPPreviewViewController *_Nullable previewViewController, NSError *_Nullable error))handler;
The stopRecordingWithHandler: API calls back with a previewViewController (the preview page); presenting it via presentViewController lets the user trim, share, or save the recording to the photo album:
[self presentViewController:previewViewController animated:YES completion:^{}];
Listening for the user's actions on the preview page:
#pragma mark - RPPreviewViewControllerDelegate
- (void)previewController:(RPPreviewViewController *)previewController didFinishWithActivityTypes:(NSSet <NSString *> *)activityTypes
{
if ([activityTypes containsObject:@"com.apple.UIKit.activity.SaveToCameraRoll"]) {
dispatch_async(dispatch_get_main_queue(), ^{
NSLog(@"Saved successfully");
});
}
if ([activityTypes containsObject:@"com.apple.UIKit.activity.CopyToPasteboard"]) {
dispatch_async(dispatch_get_main_queue(), ^{
NSLog(@"Copied successfully");
});
}
}
- (void)previewControllerDidFinish:(RPPreviewViewController *)previewController
{
[previewController dismissViewControllerAnimated:YES completion:^{
}];
}
By intercepting the RPPreviewViewController we can print the recorded video's path: videoUrl = file:///private/var/mobile/Library/ReplayKit/ReplaykitDemo_06-28-2021%2015-51-13_1.mp4
The file lives in a system directory outside the app sandbox, so it cannot be accessed directly.
Summary:
Pros:
Highly encapsulated and simple to use; screen recording can be implemented quickly.
Cons:
- The raw data is not accessible while recording; only the MP4 file that Apple has already processed and muxed is available, and only after recording stops.
- The recorded file cannot be read directly; the user must first save it to the photo album, and only then can the app access it through the album.
- Stopping the recording requires presenting a preview window, in which the user can save, cancel, share, or even edit the video.
- Because of the above, the app can only access the recording after the user saves it to the album. To upload it to a server, the video first has to be copied from the album into the sandbox, and only then can the upload begin.
- Recording parameters cannot be configured.
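Given those constraints, uploading means first copying the user-saved video out of the photo album into the sandbox. A minimal sketch using PhotoKit (the destination file name and the "most recently created video" heuristic are illustrative assumptions; photo-library authorization is assumed to have been granted already):

```objectivec
#import <Photos/Photos.h>

// Fetch the newest video in the library and copy its file into the sandbox,
// after which it can be read for upload.
- (void)copyLatestVideoToSandbox {
    PHFetchOptions *options = [PHFetchOptions new];
    options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
    PHAsset *asset = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeVideo options:options].firstObject;
    if (!asset) { return; }
    [[PHImageManager defaultManager] requestAVAssetForVideo:asset
                                                    options:nil
                                              resultHandler:^(AVAsset *avAsset, AVAudioMix *audioMix, NSDictionary *info) {
        if (![avAsset isKindOfClass:[AVURLAsset class]]) { return; }
        NSURL *srcURL = ((AVURLAsset *)avAsset).URL;
        // Hypothetical destination path inside the sandbox
        NSString *dstPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"recording.mp4"];
        [[NSFileManager defaultManager] copyItemAtURL:srcURL
                                                toURL:[NSURL fileURLWithPath:dstPath]
                                                error:nil];
    }];
}
```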
iOS 10.0
Improvements:
// New start-recording API
- (void)startRecordingWithHandler:(nullable void (^)(NSError *_Nullable error))handler API_AVAILABLE(ios(10.0), tvos(10.0), macos(11.0));
// The microphoneEnabled property controls whether the microphone is on
@property (nonatomic, getter = isMicrophoneEnabled) BOOL microphoneEnabled API_UNAVAILABLE(tvOS);
// Stopping the recording and presenting the preview page for editing work the same as on iOS 9.0
Summary: same as iOS 9.0.
New features
On top of iOS 9's save-to-file screen recording, iOS 10 adds streaming live: the broadcast stream can be distributed and played live. This is implemented through two new ReplayKit extension points, Broadcast Upload Extension and Broadcast Setup UI Extension:
Broadcast Upload Extension
handles the captured screen-recording data of the app;
Broadcast Setup UI Extension
provides the UI interactions around screen capture.
Steps:
- Add the extension via File -> New -> Target ->
Broadcast Upload Extension
Xcode generates two targets, two matching directories, and four files:
SampleHandler.h
SampleHandler.m
BroadcastSetupViewController.h
BroadcastSetupViewController.m
SampleHandler
mainly processes the stream data: RPSampleBufferTypeVideo, RPSampleBufferTypeAudioApp, and RPSampleBufferTypeAudioMic.
BroadcastSetupViewController
is the interactive page inserted when the broadcast process starts; it can be used to collect authentication input from the user, or for any other custom UI.
- Present the broadcast picker
// Present the broadcast picker
+ (void)loadBroadcastActivityViewControllerWithHandler:(void (^)(RPBroadcastActivityViewController *_Nullable broadcastActivityViewController, NSError *_Nullable error))handler;
[RPBroadcastActivityViewController loadBroadcastActivityViewControllerWithHandler:^(RPBroadcastActivityViewController * _Nullable broadcastActivityViewController, NSError * _Nullable error) {
if (error) {
NSLog(@"RPBroadcast err %@", [error localizedDescription]);
}
broadcastActivityViewController.delegate = self;
[self presentViewController:broadcastActivityViewController animated:YES completion:nil];
}];
- Start the broadcast process from the delegate callback
#pragma mark - Broadcasting
- (void)broadcastActivityViewController:(RPBroadcastActivityViewController *) broadcastActivityViewController
didFinishWithBroadcastController:(RPBroadcastController *)broadcastController
error:(NSError *)error {
[broadcastActivityViewController dismissViewControllerAnimated:YES completion:nil];
self.broadcastController = broadcastController;
self.broadcastController.delegate = self;
if (error) {
return;
}
// Start broadcasting
[broadcastController startBroadcastWithHandler:^(NSError * _Nullable error) {
if (!error) {
NSLog(@"-----start success----");
// A camera preview view could be added here
} else {
NSLog(@"startBroadcast:%@",error.localizedDescription);
}
}];
}
- Setup UI configuration
- (void)userDidFinishSetup {
NSURL *broadcastURL = [NSURL URLWithString:@"http://apple.com/broadcast/streamID"];
NSDictionary *setupInfo = @{ @"broadcastName" : @"example" };
// Tell ReplayKit that the extension is finished setting up and can begin broadcasting
[self.extensionContext completeRequestWithBroadcastURL:broadcastURL setupInfo:setupInfo];
}
- (void)userDidCancelSetup {
[self.extensionContext cancelRequestWithError:[NSError errorWithDomain:@"YourAppDomain" code:-1 userInfo:nil]];
}
// Calling userDidFinishSetup here immediately skips the setup UI
- (void)viewWillAppear:(BOOL)animated
{
[super viewWillAppear:animated];
[self userDidFinishSetup];
}
- Receiving and processing the stream data
- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *,NSObject *> *)setupInfo {
// User has requested to start the broadcast. Setup info from the UI extension can be supplied but optional.
}
- (void)broadcastPaused {
// User has requested to pause the broadcast. Samples will stop being delivered.
}
- (void)broadcastResumed {
// User has requested to resume the broadcast. Samples delivery will resume.
}
- (void)broadcastFinished {
// User has requested to finish the broadcast.
}
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
switch (sampleBufferType) {
case RPSampleBufferTypeVideo:
// Handle video sample buffer
break;
case RPSampleBufferTypeAudioApp:
// Handle audio sample buffer for app audio
break;
case RPSampleBufferTypeAudioMic:
// Handle audio sample buffer for mic audio
break;
default:
break;
}
}
processSampleBuffer: is where the raw audio and video data finally arrives. The audio is not mixed: the microphone PCM and the app-audio PCM come as separate streams, while the video is delivered as YUV data.
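Because mic and app audio arrive unmixed, producing a single audio track requires mixing them. A minimal sketch, assuming both buffers have already been converted to 16-bit signed PCM with matching sample rate and channel count (a format that must be verified against the actual CMSampleBuffer description before use):

```objectivec
// Mix mic PCM and app PCM by summing samples with clipping.
- (NSData *)mixPcmData:(NSData *)micData withAppData:(NSData *)appData {
    NSUInteger count = MIN(micData.length, appData.length) / sizeof(int16_t);
    const int16_t *mic = micData.bytes;
    const int16_t *app = appData.bytes;
    NSMutableData *mixed = [NSMutableData dataWithLength:count * sizeof(int16_t)];
    int16_t *out = mixed.mutableBytes;
    for (NSUInteger i = 0; i < count; i++) {
        // Sum in 32-bit, then clamp to the int16 range to avoid overflow
        int32_t sum = (int32_t)mic[i] + (int32_t)app[i];
        if (sum > INT16_MAX) sum = INT16_MAX;
        if (sum < INT16_MIN) sum = INT16_MIN;
        out[i] = (int16_t)sum;
    }
    return mixed;
}
```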
Summary:
Pros:
- Besides screen recording, the new live-broadcast capability makes the framework considerably more powerful
- The raw audio/video streams are accessible, which enables audio/video effects
Cons:
- Higher user-interaction cost: the broadcast picker must be presented and the user has to tap the matching broadcast service, so the friction is somewhat higher
- Integration is harder than on iOS 9.0, and processing the raw data is fairly involved
iOS 11.0
New features
A new API skips iOS 10's intermediate picker sheet and its tap-to-select step, but it can still only record in-app content.
+ (void)loadBroadcastActivityViewControllerWithPreferredExtension:(NSString * _Nullable)preferredExtension handler:(nonnull void(^)(RPBroadcastActivityViewController * _Nullable broadcastActivityViewController, NSError * _Nullable error))handler API_AVAILABLE(ios(11.0)) API_UNAVAILABLE(tvOS);
The processing flow is the same as with the iOS 10 extensions.
New screen-capture API
Starting capture delivers sampleBuffer through a callback:
- (void)startCaptureWithHandler:(nullable void (^)(CMSampleBufferRef sampleBuffer, RPSampleBufferType bufferType, NSError *_Nullable error))captureHandler completionHandler:(nullable void (^)(NSError *_Nullable error))completionHandler API_AVAILABLE(ios(11.0), tvos(11.0), macos(11.0));
Calling this API captures sampleBuffer directly, skipping the iOS 10 extension setup: the raw buffers are available with no intermediate user interaction, which fully satisfies the project requirements stated at the top.
Summary:
Pros:
- Simple to call and easy to integrate
- No intermediate user-interaction step, so the interaction cost is low
- Direct access to the raw audio/video data
Cons:
Processing the raw data is somewhat demanding
Appendix
Encoding the raw audio/video into an MP4 written to the local sandbox
1. On iOS, encoding and muxing use AVAssetWriter, paired with AVAssetWriterInput:
//writer
@property (nonatomic, strong) AVAssetWriter *assetWriter;
// Video input
@property (nonatomic, strong) AVAssetWriterInput *assetWriterVideoInput;
// Mic audio input
@property (nonatomic, strong) AVAssetWriterInput *assetWriterAudioInput;
// In-app audio input
@property (nonatomic, strong) AVAssetWriterInput *assetWriterAppAudioInput;
// Initialization
self.assetWriter = [AVAssetWriter assetWriterWithURL:[NSURL fileURLWithPath:videoOutPath] fileType:AVFileTypeMPEG4 error:&error];
2. Video encoding configuration
// Video settings
NSDictionary *compressionProperties = @{
AVVideoProfileLevelKey : AVVideoProfileLevelH264HighAutoLevel,
AVVideoH264EntropyModeKey : AVVideoH264EntropyModeCABAC,
AVVideoAverageBitRateKey : @(DEVICE_WIDTH * DEVICE_HEIGHT * 6.0),
AVVideoMaxKeyFrameIntervalKey : @15,
AVVideoExpectedSourceFrameRateKey : @(15),
AVVideoAllowFrameReorderingKey : @NO};
NSNumber* width= [NSNumber numberWithFloat:DEVICE_WIDTH];
NSNumber* height = [NSNumber numberWithFloat:DEVICE_HEIGHT];
NSDictionary *videoSettings = @{
AVVideoCompressionPropertiesKey :compressionProperties,
AVVideoCodecKey :AVVideoCodecTypeH264,
AVVideoWidthKey : width,
AVVideoHeightKey: height
};
self.assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
3. Audio encoding configuration
// Audio settings
NSDictionary * audioCompressionSettings = @{ AVEncoderBitRatePerChannelKey : @(28000),
AVFormatIDKey : @(kAudioFormatMPEG4AAC),
AVNumberOfChannelsKey : @(1),
AVSampleRateKey : @(22050) };
self.assetWriterAudioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioCompressionSettings];
4. Add the inputs to the writer
// Video
[self.assetWriter addInput:self.assetWriterVideoInput];
[self.assetWriterVideoInput setMediaTimeScale:60];
[self.assetWriterVideoInput setExpectsMediaDataInRealTime:YES];
[self.assetWriter setMovieTimeScale:60];
// Mic audio
[self.assetWriter addInput:self.assetWriterAudioInput];
self.assetWriterAudioInput.expectsMediaDataInRealTime = YES;
// In-app audio
[self.assetWriter addInput:self.assetWriterAppAudioInput];
self.assetWriterAppAudioInput.expectsMediaDataInRealTime = YES;
5. Putting it together
[[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
if (CMSampleBufferDataIsReady(sampleBuffer)) {
if (self.assetWriter.status == AVAssetWriterStatusUnknown && bufferType == RPSampleBufferTypeVideo) {
[self.assetWriter startWriting];
[self.assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
}
if (self.assetWriter.status == AVAssetWriterStatusFailed) {
NSLog(@"An error occurred.");
[self writeDidOccureError:self.assetWriter.error callBack:handler];
return;
}
if (bufferType == RPSampleBufferTypeVideo) {
if (self.assetWriterVideoInput.isReadyForMoreMediaData) {
[self.assetWriterVideoInput appendSampleBuffer:sampleBuffer];
}
}else if (bufferType == RPSampleBufferTypeAudioMic)
{
if (self.assetWriterAudioInput.isReadyForMoreMediaData) {
[self.assetWriterAudioInput appendSampleBuffer:sampleBuffer];
[self sampleBuffer2PcmData:sampleBuffer];
}
}else if (bufferType == RPSampleBufferTypeAudioApp)
{
if (self.assetWriterAppAudioInput.isReadyForMoreMediaData) {
[self.assetWriterAppAudioInput appendSampleBuffer:sampleBuffer];
}
}
}
} completionHandler:^(NSError * _Nullable error) {
if (!error) {
// Start recording
NSLog(@"Recording started successfully.");
}else{
//show alert
}
}];
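The snippet above never closes the file, so a matching teardown is needed: stop the capture, mark the inputs as finished, and finish the writer so the MP4 is finalized and playable. A minimal sketch (the completion-block shape is an assumption for illustration):

```objectivec
// Stop ReplayKit capture and finalize the AVAssetWriter output file.
- (void)stopRecordingWithCompletion:(void (^)(NSURL *fileURL, NSError *error))completion {
    [[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
        if (error) { completion(nil, error); return; }
        [self.assetWriterVideoInput markAsFinished];
        [self.assetWriterAudioInput markAsFinished];
        [self.assetWriterAppAudioInput markAsFinished];
        [self.assetWriter finishWritingWithCompletionHandler:^{
            // On success, outputURL points at the finished MP4 in the sandbox
            completion(self.assetWriter.outputURL, self.assetWriter.error);
        }];
    }];
}
```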
Extracting the volume level from the audio
The key code:
/// Convert a sample buffer to PCM data
/// @param audiobuffer the audio sample buffer
- (void)sampleBuffer2PcmData:(CMSampleBufferRef)audiobuffer
{
CMSampleBufferRef ref = audiobuffer;
if(ref==NULL){
return;
}
// Extract the AudioBufferList (and its backing block buffer) from the sample buffer
AudioBufferList audioBufferList;
NSMutableData *data=[[NSMutableData alloc] init];
CMBlockBufferRef blockBuffer;
CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(ref, NULL, &audioBufferList, sizeof(audioBufferList), NULL, NULL, 0, &blockBuffer);
for( int y=0; y<audioBufferList.mNumberBuffers; y++ )
{
AudioBuffer audioBuffer = audioBufferList.mBuffers[y];
Float32 *frame = (Float32*)audioBuffer.mData;
[data appendBytes:frame length:audioBuffer.mDataByteSize];
}
[self volumeFromPcmData:data] ;
CFRelease(blockBuffer);
blockBuffer=NULL;
}
/// Compute the sound level (in dB) from PCM data
/// @param pcmData the PCM data
-(void)volumeFromPcmData:(NSData *)pcmData
{
if (pcmData == nil)
{
if ([self.delegate respondsToSelector:@selector(screenRecord:micVolume:)]) {
[self.delegate screenRecord:self micVolume:0];
}
return;
}
long long pcmAllLength = 0;
short butterByte[pcmData.length / 2];
memcpy(butterByte, pcmData.bytes, pcmData.length); // frame_size * sizeof(short)
// Sum the squares of the samples in the buffer
for (int i = 0; i < pcmData.length / 2; i++)
{
pcmAllLength += butterByte[i] * butterByte[i];
}
// Divide the sum of squares by the number of samples to get the mean power
double mean = pcmAllLength / (double)(pcmData.length / 2);
double volume = 10 * log10(mean); // volume is the level in decibels
/*
 *  0-20   very quiet, barely perceptible
 * 20-40   quiet
 * 40-60   normal indoor conversation
 * 60-70   noisy
 * 70-90   very noisy; prolonged exposure damages nerve cells
 * 90-100  loud machinery; hearing damage
 */
if ([self.delegate respondsToSelector:@selector(screenRecord:micVolume:)]) {
[self.delegate screenRecord:self micVolume:volume];
}
}
Summary
Comparing the OS versions above, the project ultimately adopted iOS 11's startCaptureWithHandler: API to capture the screen-recording data, used AVAssetWriter to encode and mux the MP4 file, and extracted the volume level from the raw audio data, completing the requirement. The snippets above are only fragments: production code still needs to handle the various error cases the business logic demands, and the video/audio encoding settings deserve further study, since tuning those parameters can further improve the quality of the recorded video. I will document the pitfalls of the whole process in a follow-up.