1. Introduction
The goal of this article is to capture the iOS screen — both the system-wide screen and the app's own content — and to start the broadcast from inside the app. This requires the ReplayKit APIs introduced in iOS 12. Broadcast extensions also come with a number of restrictions, most notably a memory cap of roughly 50 MB.
So this project needs to:
- Transfer video frame data from the Broadcast Upload Extension to the host app, for which there are two approaches:
  - Use a socket for inter-process communication
  - Use an App Group
- Keep the app alive in the background so capture continues
- Encode the video data in the host app
- Share common utility classes between the host app and the extension, which means also creating a framework
Based on the above, the plan is:
- Build environment: Xcode 14.2, deployment target iOS 12
- Create a Broadcast Upload Extension
- Keep the app permanently alive
- Create a framework with the classes shared by the Broadcast Upload Extension and the host app
- Capture the system screen
- Share the screen inside the app
2. Step 1: Create the Broadcast Upload Extension
Steps: File -> New -> Target
This generates an extension app with a SampleHandler class, as shown in the figure. SampleHandler continuously receives the captured video and audio frames:
- broadcastStartedWithSetupInfo: is called once when the host app starts the screen broadcast
- processSampleBuffer:withType: is called repeatedly with live sample data
- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *,NSObject *> *)setupInfo {
    // User has requested to start the broadcast. Setup info from the UI extension can be supplied but optional.
    // Called once when the host app starts the screen broadcast.
    // Set up the socket; see the demo for the FIAgoraSampleHandlerSocketManager implementation.
    [[FIAgoraSampleHandlerSocketManager sharedManager] setUpSocket];
}

- (void)broadcastPaused {
    // User has requested to pause the broadcast. Samples will stop being delivered.
}

- (void)broadcastResumed {
    // User has requested to resume the broadcast. Samples delivery will resume.
}

- (void)broadcastFinished {
    // User has requested to finish the broadcast.
}

// Receives captured data in real time
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            // Handle video sample buffer
            // Send the video data to the host app
            [[FIAgoraSampleHandlerSocketManager sharedManager] sendVideoBufferToHostApp:sampleBuffer];
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle audio sample buffer for app audio
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle audio sample buffer for mic audio
            break;
        default:
            break;
    }
}
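The implementation of sendVideoBufferToHostApp: lives in the demo and is not shown here. As a rough illustration of what sending a frame over a socket involves, a video frame first has to be flattened into bytes. The sketch below assumes a non-planar (e.g. BGRA) pixel buffer; it is not the demo's actual code:

```objectivec
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

// Hypothetical helper: copy a frame's raw pixel bytes into NSData so they
// can be written to a socket. Assumes a single-plane pixel format.
static NSData *FrameDataFromSampleBuffer(CMSampleBufferRef sampleBuffer) {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) return nil;
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    NSData *data = [NSData dataWithBytes:CVPixelBufferGetBaseAddress(pixelBuffer)
                                  length:bytesPerRow * height];
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    return data;
}
```

In practice the demo also frames each payload (length prefix, width/height metadata) so the host app can reassemble frames from the stream.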
3. Create a framework for the data-transfer classes
FIAgoraSampleHandlerSocketManager and the other data-transfer classes live in a shared framework, so first create one.
Steps: File -> New -> Target, then choose Framework
Once created, link it from both the host app and the extension, as shown in figure 2.
4. The host app
- Start the broadcast manually. The system picker UI has a fixed look, so some work is needed to restyle it.
- Keep the app permanently alive. I originally assumed that starting a broadcast would keep the app alive automatically, but my broadcasts kept getting interrupted for no obvious reason, so for now this step appears to be mandatory.
- Receive the data through a socket callback block.
- Encode the data. Raw video carries a lot of redundant information; compressing and cropping it so it can be transmitted without dropping frames is what encoding means here, and the output is typically H.264.
- Push the encoded data to the streaming server.
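The encoder class used later (`h264code`) is part of the demo. As a hedged sketch of what such a wrapper typically sets up on iOS, here is a minimal VideoToolbox H.264 compression session; the callback body, dimensions, and bitrate are illustrative assumptions, not the demo's implementation:

```objectivec
#import <VideoToolbox/VideoToolbox.h>

// Called by VideoToolbox with each encoded frame; extract the NALUs from
// sampleBuffer here and hand them to the push-stream layer.
static void CompressionOutputCallback(void *refcon, void *frameRefcon,
                                      OSStatus status, VTEncodeInfoFlags flags,
                                      CMSampleBufferRef sampleBuffer) {
    if (status != noErr || sampleBuffer == NULL) return;
    // ... parse parameter sets and NALUs ...
}

// Create a real-time H.264 encoder session for the given frame size.
static VTCompressionSessionRef MakeEncoder(int32_t width, int32_t height) {
    VTCompressionSessionRef session = NULL;
    VTCompressionSessionCreate(kCFAllocatorDefault, width, height,
                               kCMVideoCodecType_H264, NULL, NULL, NULL,
                               CompressionOutputCallback, NULL, &session);
    if (session) {
        VTSessionSetProperty(session, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
        // Rough bitrate heuristic; tune for your stream.
        VTSessionSetProperty(session, kVTCompressionPropertyKey_AverageBitRate,
                             (__bridge CFTypeRef)@(width * height * 4));
        VTCompressionSessionPrepareToEncodeFrames(session);
    }
    return session;
}
```

Frames are then submitted with VTCompressionSessionEncodeFrame, and the callback fires with compressed output.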
4.1 Set up the button that starts the broadcast
self.broadcastPickerView.preferredExtension binds the picker to your extension's bundle ID, so when the broadcast is started, the system sheet only shows your own extension.
Restyling the system-provided button is a hack that may break in a future iOS release, but it works fine for now.
// Set up the system broadcast picker view
- (void)setupSystemBroadcastPickerView
{
    // Requires iOS 12 or later
    if (@available(iOS 12.0, *)) {
        self.broadcastPickerView = [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(50, 200, 100, 100)];
        self.broadcastPickerView.preferredExtension = @"summerxx.com.screen-share-ios.broadcast-extension";
        self.broadcastPickerView.backgroundColor = UIColor.cyanColor;
        self.broadcastPickerView.showsMicrophoneButton = NO;
        [self.view addSubview:self.broadcastPickerView];
    }

    // Restyle the system-provided button; this hack may break in a future iOS release, but it works for now
    UIButton *startButton = [UIButton buttonWithType:UIButtonTypeCustom];
    startButton.frame = CGRectMake(50, 310, 100, 100);
    startButton.backgroundColor = UIColor.cyanColor;
    [startButton setTitle:@"開啟攝像頭" forState:UIControlStateNormal];
    [startButton setTitleColor:UIColor.blackColor forState:UIControlStateNormal];
    [startButton addTarget:self action:@selector(startAction) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:startButton];
}
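The article does not show startAction itself. A widely used trick for triggering the broadcast from a custom button — and the source of the fragility mentioned above — is to forward the tap to the UIButton that RPSystemBroadcastPickerView keeps in its private view hierarchy. This is a sketch of that trick, not necessarily what the demo's startAction does:

```objectivec
// Forward our custom button's tap to the button inside the system picker.
// This relies on the picker's private subview layout and may break in a
// future iOS release.
- (void)startAction {
    for (UIView *view in self.broadcastPickerView.subviews) {
        if ([view isKindOfClass:[UIButton class]]) {
            [(UIButton *)view sendActionsForControlEvents:UIControlEventTouchUpInside];
        }
    }
}
```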
4.2 Permanent keep-alive, implemented here by continuously playing audio
// Observe foreground/background transitions
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(didEnterBackGround) name:UIApplicationDidEnterBackgroundNotification object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(willEnterForeground) name:UIApplicationWillEnterForegroundNotification object:nil];

- (void)willEnterForeground
{
    // See the demo for details
    [[FJDeepSleepPreventerPlus sharedInstance] stop];
}

- (void)didEnterBackGround
{
    [[FJDeepSleepPreventerPlus sharedInstance] start];
}
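FJDeepSleepPreventerPlus comes from the demo. As a sketch of the underlying idea (an assumption about its approach, not its actual source), a keep-alive of this kind loops a silent audio file so iOS keeps the process running in the background. It requires the "Audio" background mode capability, and "silence.wav" below is a placeholder asset name:

```objectivec
#import <AVFoundation/AVFoundation.h>

// Sketch of an audio-based deep-sleep preventer: loop an inaudible clip
// while the app is backgrounded so capture is not suspended.
@interface DeepSleepPreventerSketch : NSObject
@property (nonatomic, strong) AVAudioPlayer *player;
@end

@implementation DeepSleepPreventerSketch

- (void)start {
    AVAudioSession *session = [AVAudioSession sharedInstance];
    // Mix with other audio so we do not interrupt the user's playback.
    [session setCategory:AVAudioSessionCategoryPlayback
             withOptions:AVAudioSessionCategoryOptionMixWithOthers
                   error:nil];
    [session setActive:YES error:nil];
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"silence" withExtension:@"wav"];
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
    self.player.numberOfLoops = -1; // loop indefinitely
    self.player.volume = 0;         // inaudible
    [self.player play];
}

- (void)stop {
    [self.player stop];
    self.player = nil;
}

@end
```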
4.3 Data callback
__weak __typeof(self) weakSelf = self;
[FIAgoraClientBufferSocketManager sharedManager].testBlock = ^(NSString *testText, CMSampleBufferRef sampleBuffer) {
    // Encode the video
    [weakSelf.h264code encodeSampleBuffer:sampleBuffer H264DataBlock:^(NSData *data) {
        NSLog(@"%@", data);
        // After encoding, the data can enter the push-stream pipeline
    }];
};
That covers transferring video frames over a socket, along with the details I ran into along the way.
5. Transferring data with an App Group
- Create an App Group for the extension
- Create an NSUserDefaults instance bound to the App Group
- Write frames into that NSUserDefaults to transfer them
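The steps above boil down to one call on the extension side. The suite name here is a placeholder; it must exactly match the App Group identifier enabled on both the host app and extension targets:

```objectivec
// In the extension (e.g. in SampleHandler's init), bind NSUserDefaults
// to the shared App Group so both processes read and write the same store.
self.userDefaults = [[NSUserDefaults alloc] initWithSuiteName:@"group.your.app.group.id"];
```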
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType
{
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
        {
            // Handle video sample buffer
            @autoreleasepool {
                CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
                float cropRate = (float)CVPixelBufferGetWidth(pixelBuffer) / (float)CVPixelBufferGetHeight(pixelBuffer);
                CGSize targetSize = CGSizeMake(540, 960);
                NTESVideoPackOrientation targetOrientation = NTESVideoPackOrientationPortrait;
                if (@available(iOS 11.0, *)) {
                    CFStringRef RPVideoSampleOrientationKeyRef = (__bridge CFStringRef)RPVideoSampleOrientationKey;
                    NSNumber *orientation = (NSNumber *)CMGetAttachment(sampleBuffer, RPVideoSampleOrientationKeyRef, NULL);
                    if (orientation.integerValue == kCGImagePropertyOrientationUp ||
                        orientation.integerValue == kCGImagePropertyOrientationUpMirrored) {
                        targetOrientation = NTESVideoPackOrientationPortrait;
                    } else if (orientation.integerValue == kCGImagePropertyOrientationDown ||
                               orientation.integerValue == kCGImagePropertyOrientationDownMirrored) {
                        targetOrientation = NTESVideoPackOrientationPortraitUpsideDown;
                    } else if (orientation.integerValue == kCGImagePropertyOrientationLeft ||
                               orientation.integerValue == kCGImagePropertyOrientationLeftMirrored) {
                        targetOrientation = NTESVideoPackOrientationLandscapeLeft;
                    } else if (orientation.integerValue == kCGImagePropertyOrientationRight ||
                               orientation.integerValue == kCGImagePropertyOrientationRightMirrored) {
                        targetOrientation = NTESVideoPackOrientationLandscapeRight;
                    }
                }
                NTESI420Frame *videoFrame = [NTESYUVConverter pixelBufferToI420:pixelBuffer
                                                                       withCrop:cropRate
                                                                     targetSize:targetSize
                                                                 andOrientation:targetOrientation];
                NSDictionary *frame = @{
                    @"width": @(videoFrame.width),
                    @"height": @(videoFrame.height),
                    @"data": [videoFrame bytes],
                    @"timestamp": @(CACurrentMediaTime() * 1000)
                };
                [self.userDefaults setObject:frame forKey:@"frame"];
                [self.userDefaults synchronize];
            }
        }
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle audio sample buffer for app audio
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle audio sample buffer for mic audio
            break;
        default:
            break;
    }
}
In the host app:
// App Group data transfer
- (void)setupUserDefaults
{
    // Use NSUserDefaults as the data channel for receiving video frames from the extension
    self.userDefaults = [[NSUserDefaults alloc] initWithSuiteName:kAppGroup];
}
// Observe the screen-frame data
- (void)addObserver
{
    // KVO on the shared user defaults
    [self.userDefaults addObserver:self forKeyPath:@"frame" options:NSKeyValueObservingOptionNew context:KVOContext];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary<NSKeyValueChangeKey,id> *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"frame"]) {
        NSDictionary *i420Frame = change[NSKeyValueChangeNewKey];
        NSData *data = i420Frame[@"data"];
        NTESI420Frame *frame = [NTESI420Frame initWithData:data];
        CMSampleBufferRef sampleBuffer = [frame convertToSampleBuffer];
        if (sampleBuffer == NULL) {
            return;
        }
#warning Screen-share data does not need to be decoded here; encoding and decoding at the same time makes memory spike. This is only for testing the picture.
        __weak typeof(self) weakSelf = self;
        [weakSelf.h264code encodeSampleBuffer:sampleBuffer H264DataBlock:^(NSData *data) {
            NSLog(@"%@", data);
            // Normally the encoded data would be pushed to the stream here
        }];
        // Release the sample buffer
        CFRelease(sampleBuffer);
    }
}

- (void)dealloc
{
    [self.userDefaults removeObserver:self forKeyPath:@"frame"];
}
Summary:
That is the App Group approach. I wrote two demos, one per approach; they also include decoding, camera capture, and rendering, so the full encode/decode path can be tested.
I consulted a lot of material along the way; the relevant links are listed below.
The demos are available here:
App Group demo: https://github.com/summerxx27/ReplayKitShareScreen
Socket demo: https://github.com/summerxx27/ReplayKitShareScreen-socket
References
Video stream output approach
https://zhuanlan.zhihu.com/p/549325898
NetEase Yunxin documentation
http://dev.yunxin.163.com/docs/product/音視頻通話1.0/SDK開發集成/iOS開發集成/屏幕共享
Processing audio/video formats with ffmpeg and converting raw screen-recording data to mp4
http://www.lxweimin.com/p/41ea7e06c971
Handling the iOS ReplayKit 50 MB limit
http://www.lxweimin.com/p/8c25a3bbcb16
Manually starting a screen-recording broadcast on iOS 12
https://www.cnblogs.com/songliquan/p/15891392.html
Encoding demo
https://github.com/gezhaoyou/CaptureVideoDemo/tree/master
Handling the iOS ReplayKit 50 MB limit
https://juejin.cn/post/6968738257123147807
Encoding with VideoToolbox
http://www.lxweimin.com/p/67d0dd931ed6
Live-streaming basics
https://www.cnblogs.com/junhuawang/p/7fe457786.html
Add support for publishing in background mode: VideoToolBox now supports background mode
https://github.com/shogo4405/HaishinKit.swift/issues/626
iOS audio/video development, part 8: video encoding with both H.264 and H.265 support
https://blog.csdn.net/m0_60259116/article/details/124804169
iOS VideoToolbox hardware-encoding error codes
http://www.lxweimin.com/p/dce0a52e1bd6
Tencent Cloud article
https://cloud.tencent.com/developer/article/2021517
Alibaba Cloud documentation
https://developer.aliyun.com/ask/64678?spm=a2c6h.13159736
A fairly detailed write-up on the broadcast extension
http://www.lxweimin.com/p/bbe736e7b5eb
Changing the button style
http://kinoandworld.github.io/2021/07/20/RecordScreenLiveSummary/
ReplayKit screen recording in practice on iOS
http://www.lxweimin.com/p/392777d1995c
Tencent Cloud screen sharing