Preface
AVFoundation is one of the key frameworks on iOS: all software and hardware control related to audio and video goes through it. This article covers media capture on iOS, with a focus on video capture.
Media Capture Pipeline
[Figure: media capture pipeline]
Overview
- AVCaptureSession: the capture session (audio and/or video). It routes the captured data from its inputs to its outputs, and a single AVCaptureSession can have multiple inputs and outputs. To capture video or audio, a client instantiates an AVCaptureSession and adds the appropriate AVCaptureInputs (such as AVCaptureDeviceInput) and outputs.
- AVCaptureDevice and AVCaptureInput: the input-device side. You create an AVCaptureDeviceInput from an AVCaptureDevice and add it to the AVCaptureSession, which then manages it.
- AVCaptureOutput: manages the data coming out of the session.
- AVCaptureVideoPreviewLayer and AVSampleBufferDisplayLayer: CALayer subclasses used to preview the camera. The former is created from an AVCaptureSession; the latter can be created directly and displays the CMSampleBufferRefs you enqueue. A minimal wiring sketch follows this list.
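The sketch below shows how these pieces connect, assuming it runs inside a view controller whose view hosts the preview; error handling and format selection are omitted, and the full configuration is shown in the next section.

// Minimal wiring sketch: session -> device input -> data output -> preview layer.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
if (input && [session canAddInput:input]) {
    [session addInput:input];
}
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:output]) {
    [session addOutput:output];
}
AVCaptureVideoPreviewLayer *preview = [AVCaptureVideoPreviewLayer layerWithSession:session];
preview.frame = self.view.bounds;
[self.view.layer addSublayer:preview];
[session startRunning];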
Sample Code
- (void)configureCamera {
    /// Capture parameters
    // Use the back camera by default
    AVCaptureDevicePosition position = AVCaptureDevicePositionBack;
    // Frame rate
    int frameRate = 25;
    // Pixel format
    OSType videoFormat = kCVPixelFormatType_32BGRA;
    // Target resolution height
    int resolutionHeight = 720;
    /// Create the AVCaptureSession
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    /// Set the session preset (resolution)
    session.sessionPreset = AVCaptureSessionPreset1280x720;
    /// Find the camera for the requested position
    AVCaptureDevice *captureDevice;
    // The discovery session is already filtered by `position` (back camera by default)
    AVCaptureDeviceDiscoverySession *deviceDiscoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:position];
    for (AVCaptureDevice *device in deviceDiscoverySession.devices) {
        if (device.position == position) {
            captureDevice = device;
            break;
        }
    }
    if (!captureDevice) {
        NSLog(@"%s: no camera found for the requested position", __func__);
        return;
    }
    /// Pick a device format that matches the frame rate, pixel format and resolution
    BOOL isSuccess = NO;
    for (AVCaptureDeviceFormat *vFormat in captureDevice.formats) {
        CMFormatDescriptionRef description = vFormat.formatDescription;
        float maxRate = ((AVFrameRateRange *)vFormat.videoSupportedFrameRateRanges.firstObject).maxFrameRate;
        if (maxRate < frameRate || CMFormatDescriptionGetMediaSubType(description) != videoFormat) {
            continue;
        }
        // Compare the format's dimensions with the requested resolution
        CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(description);
        if (dims.height != resolutionHeight || dims.width != [self.class getResolutionWidthByHeight:resolutionHeight]) {
            continue;
        }
        [session beginConfiguration];
        if ([captureDevice lockForConfiguration:NULL]) {
            captureDevice.activeFormat = vFormat;
            [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
            [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
            [captureDevice unlockForConfiguration];
            isSuccess = YES;
        } else {
            NSLog(@"%s: failed to lock the device for configuration", __func__);
        }
        [session commitConfiguration];
        break;
    }
    NSError *error;
    // Add the input
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
    if (!input) {
        NSLog(@"Failed to create the device input: %@", error.localizedDescription);
        return;
    }
    if ([session canAddInput:input]) {
        [session addInput:input];
    }
    // Add the output
    AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoDataOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(videoFormat)};
    if ([session canAddOutput:videoDataOutput]) {
        [session addOutput:videoDataOutput];
    }
    // When YES (the default), frames that arrive while the delegate queue is still busy
    // processing an earlier frame are dropped immediately. When NO, the delegate gets more
    // time to finish old frames before new ones are discarded, at the cost of potentially
    // much higher memory usage.
    videoDataOutput.alwaysDiscardsLateVideoFrames = NO;
    // Serial queue on which the sample buffers are delivered
    dispatch_queue_t videoQueue = dispatch_queue_create("video_receive_queue", DISPATCH_QUEUE_SERIAL);
    [videoDataOutput setSampleBufferDelegate:self queue:videoQueue];
    // Create the preview layer
    AVCaptureVideoPreviewLayer *videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    videoPreviewLayer.backgroundColor = [[UIColor blackColor] CGColor];
    // Set the frame and the fill mode
    [videoPreviewLayer setFrame:[UIScreen mainScreen].bounds];
    [videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
    if ([[videoPreviewLayer connection] isVideoOrientationSupported]) {
        [videoPreviewLayer.connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    } else {
        NSLog(@"Video orientation is not supported");
    }
    // Insert the layer into whichever view should show the preview
    UIView *showView = [[UIView alloc] init];
    [showView.layer insertSublayer:videoPreviewLayer atIndex:0];
    // Start the session (this can also be deferred until the preview view appears)
    [session startRunning];
}
// Delegate callback for each captured video frame. If you need to encode to H.264/H.265,
// this is where you hand the sample buffer to the encoder.
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    /*
    // An alternative way to display frames: an AVSampleBufferDisplayLayer can render
    // sample buffers directly (create the layer once and reuse it, not once per frame).
    AVSampleBufferDisplayLayer *previewLayer = [AVSampleBufferDisplayLayer layer];
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [previewLayer enqueueSampleBuffer:sampleBuffer];
    // Insert the layer into whichever view should show the frames
    UIView *showView = [[UIView alloc] init];
    [showView.layer insertSublayer:previewLayer atIndex:0];
    */
}
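For example, a minimal sketch of pulling the pixel buffer out of the sample buffer inside this callback might look like the following; the encodePixelBuffer: call is a hypothetical hook for whatever encoder you use:

// Inside captureOutput:didOutputSampleBuffer:fromConnection:
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
if (pixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    NSLog(@"Captured frame: %zu x %zu", width, height);
    // [self.encoder encodePixelBuffer:pixelBuffer]; // hypothetical encoder hook
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}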
/// Maps a resolution height to the matching capture width.
/// Landscape/portrait orientation is not handled here.
+ (int)getResolutionWidthByHeight:(int)height {
    switch (height) {
        case 2160:
            return 3840;
        case 1080:
            return 1920;
        case 720:
            return 1280;
        case 480:
            return 640;
        default:
            return -1;
    }
}
Tips: request camera permission before configuring capture. If the app is not authorized you need to handle that yourself; the check is omitted from the code above.
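A minimal sketch of that permission check, using AVCaptureDevice's authorization API, might look like this (it assumes Info.plist already contains an NSCameraUsageDescription entry and that configureCamera is the method above):

// Check / request camera access before configuring the capture session.
AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
if (status == AVAuthorizationStatusAuthorized) {
    [self configureCamera];
} else if (status == AVAuthorizationStatusNotDetermined) {
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
        if (granted) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self configureCamera];
            });
        }
    }];
} else {
    // Denied or restricted: point the user to Settings to enable camera access.
    NSLog(@"Camera access denied");
}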
A link to the demo will be added once it has been cleaned up.
If anything is unclear, leave a comment and I'll reply when I see it.
If you found this useful, please give it a like. Reposting is welcome; please include a link to the original.