Video Capture
An introduction to the relevant classes:
- [ ] AVCaptureDevice: a hardware device such as the microphone or camera. Through this object you can configure physical-device properties (e.g. camera focus, white balance).
- [ ] AVCaptureDeviceInput: a hardware input object. You create an AVCaptureDeviceInput from an AVCaptureDevice; it manages the data coming in from that device.
- [ ] AVCaptureOutput: a hardware output object that receives the captured data. You normally use one of its subclasses: AVCaptureAudioDataOutput (audio data output) or AVCaptureVideoDataOutput (video data output).
- [ ] AVCaptureConnection: once an input and an output have been added to an AVCaptureSession, the session establishes a connection between them, and you can obtain this connection object from the AVCaptureOutput.
- [ ] AVCaptureVideoPreviewLayer: a preview layer for the camera, used to view photo or video capture in real time. Creating it requires the corresponding AVCaptureSession, because the session carries the video input data, and there must be video data before anything can be displayed.
- [ ] AVCaptureSession: coordinates the flow of data between inputs and outputs. Its role: it lets you drive the hardware devices.
How it works: the app opens a capture session with the system, which effectively links the app to the hardware. We only need to add the hardware input and output objects to the session; the session automatically connects the inputs to the outputs, and the input and output devices can then exchange audio/video data.
Concrete steps for capturing video:
Required properties (the class adopts both sample-buffer delegate protocols):
<AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *captureSession; // the capture session
@property (nonatomic, strong) AVCaptureDeviceInput *currentVideoDeviceInput; // the current video input
@property (nonatomic, weak) AVCaptureConnection *videoConnection;
@property (nonatomic, weak) AVCaptureVideoPreviewLayer *previewLayer;
.h
/**
 Start the camera
 */
- (void)startRunning;
@property (nonatomic, copy) void (^sampleBufferCallBack)(CMSampleBufferRef sampleBuffer); // callback delivering the captured data
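Before the session can deliver any frames, the user must have granted camera (and microphone) access. The original post skips this, so here is a minimal sketch of checking and requesting permission before starting the session; the helper name `requestCameraAccessThenStart` is illustrative, not part of the original class:

```objectivec
// Sketch: request camera permission, then start the session.
// Remember to add NSCameraUsageDescription (and NSMicrophoneUsageDescription
// for the audio input) to Info.plist, or the app will crash on iOS 10+.
- (void)requestCameraAccessThenStart
{
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    if (status == AVAuthorizationStatusAuthorized) {
        [self startRunning];
    } else if (status == AVAuthorizationStatusNotDetermined) {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            if (granted) {
                // The completion handler may run on an arbitrary queue; hop back to main
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self startRunning];
                });
            }
        }];
    }
}
```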
.m
// Create the session
- (void)setupSession
{
    // 1. Create the capture session and set the resolution
    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
    self.captureSession = captureSession;
    captureSession.sessionPreset = AVCaptureSessionPreset1280x720; // set the resolution
    // 2. Get the camera device (here the front camera; the system default is the back camera)
    AVCaptureDevice *videoDevice = [self getVideoDevice:AVCaptureDevicePositionFront];
    // 3. Get the audio device
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    // 4. Create the input object for the video device
    AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
    _currentVideoDeviceInput = videoDeviceInput;
    // 5. Create the input object for the audio device
    AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
    // 6. Add the inputs to the session
    // Note: always check whether the input can be added; the session cannot accept a nil input
    // 6.1 Add the video input
    if ([captureSession canAddInput:videoDeviceInput])
    {
        [captureSession addInput:videoDeviceInput];
    }
    // 6.2 Add the audio input
    if ([captureSession canAddInput:audioDeviceInput])
    {
        [captureSession addInput:audioDeviceInput];
    }
    // 7. Create the video data output
    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    // 7.1 Set the delegate to capture video sample buffers
    // Note: the queue must be a serial queue to receive data, and it must not be nil
    dispatch_queue_t videoQueue = dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL);
    [videoOutput setSampleBufferDelegate:self queue:videoQueue]; // delivers the video frame data
    if ([captureSession canAddOutput:videoOutput])
    {
        [captureSession addOutput:videoOutput];
    }
    // 8. Create the audio data output
    AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc] init];
    // 8.1 Set the delegate to capture audio sample buffers
    // Note: the queue must be a serial queue to receive data, and it must not be nil
    dispatch_queue_t audioQueue = dispatch_queue_create("Audio Capture Queue", DISPATCH_QUEUE_SERIAL);
    [audioOutput setSampleBufferDelegate:self queue:audioQueue];
    if ([captureSession canAddOutput:audioOutput])
    {
        [captureSession addOutput:audioOutput];
    }
    // 9. Get the connection between the video input and output, used later to tell video data apart from audio
    self.videoConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
    self.videoConnection.videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
}
- (void)startRunning
{
    // 10. Start the capture session
    [_captureSession startRunning];
}
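The preview-layer property declared earlier is never populated in the snippet above. For completeness, here is a typical way to create an AVCaptureVideoPreviewLayer from the session and attach it to a view; `containerView` is an illustrative name, not part of the original code:

```objectivec
// Sketch: show the live camera preview. The layer must be created from the
// session, because the session is what carries the video data.
AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
previewLayer.frame = containerView.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill the view, cropping as needed
[containerView.layer insertSublayer:previewLayer atIndex:0];
```

Note that in this post the preview is instead rendered manually through OpenGL (see the last section), so the preview layer is optional here.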
// Get the camera device facing the given direction
- (AVCaptureDevice *)getVideoDevice:(AVCaptureDevicePosition)position
{
    // Note: devicesWithMediaType: is deprecated since iOS 10
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices)
    {
        if (device.position == position)
        {
            return device;
        }
    }
    return nil;
}
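`devicesWithMediaType:` has been deprecated since iOS 10. If you target iOS 10 or later, the lookup above can be rewritten with AVCaptureDeviceDiscoverySession; a sketch (the method name `videoDeviceAtPosition:` is illustrative):

```objectivec
// iOS 10+ replacement for the deprecated devicesWithMediaType: lookup
- (AVCaptureDevice *)videoDeviceAtPosition:(AVCaptureDevicePosition)position
{
    AVCaptureDeviceDiscoverySession *discovery =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                               mediaType:AVMediaTypeVideo
                                                                position:position];
    return discovery.devices.firstObject; // nil if no matching camera exists
}
```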
#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
// Receives the captured sample buffers. Both delegate protocols share this selector,
// so the buffer may be either audio or video; the connection tells them apart.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (_videoConnection == connection)
    {
        // video data
        if (self.sampleBufferCallBack)
        {
            self.sampleBufferCallBack(sampleBuffer);
        }
    }
    else
    {
        // audio data (not used here)
    }
}
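Inside the video branch, the CMSampleBufferRef can be unwrapped into a CVPixelBufferRef when you want to touch the raw pixels; a minimal sketch of what a consumer of the callback might do:

```objectivec
// Sketch: reading the raw pixel data out of a video sample buffer.
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
size_t width  = CVPixelBufferGetWidth(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);
// ... hand the buffer to an encoder or renderer here ...
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
```

Keep in mind the sample buffer is only guaranteed valid for the duration of the delegate call; if you need it longer, CFRetain it (and CFRelease it later) rather than storing the pointer directly.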
Using this together with the OpenGL rendering from earlier:
- (void)setupOpengl
{
    self.ijsVideo = [[IJSOpenGLView alloc] initWithFrame:[UIScreen mainScreen].bounds];
    IJSVideoCamera *camera = [[IJSVideoCamera alloc] init];
    self.camera = camera;
    __weak typeof(self) weakSelf = self;
    camera.sampleBufferCallBack = ^(CMSampleBufferRef sampleBuffer) {
        [weakSelf.ijsVideo displayFramebuffer:sampleBuffer];
    };
    self.view = self.ijsVideo;
    [self.camera startRunning];
}
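A common next step for a camera class like this is toggling between the front and back cameras. A hedged sketch, meant to live inside the camera class alongside setupSession (the method name `switchCamera` is illustrative); beginConfiguration/commitConfiguration lets the session keep running during the swap:

```objectivec
// Sketch: swap the camera input while the session keeps running.
- (void)switchCamera
{
    AVCaptureDevicePosition current = self.currentVideoDeviceInput.device.position;
    AVCaptureDevicePosition target = (current == AVCaptureDevicePositionFront)
        ? AVCaptureDevicePositionBack : AVCaptureDevicePositionFront;
    AVCaptureDevice *newDevice = [self getVideoDevice:target];
    AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:newDevice error:nil];
    [self.captureSession beginConfiguration];
    [self.captureSession removeInput:self.currentVideoDeviceInput];
    if ([self.captureSession canAddInput:newInput]) {
        [self.captureSession addInput:newInput];
        self.currentVideoDeviceInput = newInput;
    }
    [self.captureSession commitConfiguration];
}
```

Note that removing and re-adding an input invalidates the old connection, so videoConnection (and its orientation) must be re-fetched from the video output after the swap.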