To perform real-time capture, first initialize an AVCaptureSession object to create a capture session. You use the AVCaptureSession object to coordinate the flow of data from AV input devices to outputs.
Next, initialize an AVCaptureDeviceInput object to create an input data source that supplies video data to the capture session, and call addInput: to add that input to the AVCaptureSession object.
Then initialize an AVCaptureVideoDataOutput object to create an output destination, and call addOutput: to add it to the capture session.
AVCaptureVideoDataOutput is used to process uncompressed frames captured from the video. An AVCaptureVideoDataOutput instance produces video frames that you can process with many other multimedia APIs. You receive the frames through the captureOutput:didOutputSampleBuffer:fromConnection: delegate method, and use setSampleBufferDelegate:queue: to set the sample buffer delegate and the dispatch queue on which the callbacks are invoked. The delegate of an AVCaptureVideoDataOutput object must adopt the AVCaptureVideoDataOutputSampleBufferDelegate protocol, and you use the session's sessionPreset property to specify the output quality.
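For example, the class that will receive the frames might declare its protocol conformance as follows (a minimal sketch; the FrameCaptureController class name is illustrative, while the session property matches the setSession: call used in Listing 1):

```
#import <AVFoundation/AVFoundation.h>

// Illustrative delegate class; any NSObject subclass works as long as
// it adopts AVCaptureVideoDataOutputSampleBufferDelegate.
@interface FrameCaptureController : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
// Retains the running capture session (see setSession: in Listing 1).
@property (nonatomic, retain) AVCaptureSession *session;
@end
```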
You can start the flow of data from the inputs to the outputs by calling the capture session's startRunning method, and stop it with stopRunning.
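Listing 1 below only starts the session; a matching teardown might look like this (a sketch under the same assumptions as Listing 1, reusing its session property):

```
// Stop the flow of data and let go of the session when capture is no
// longer needed, for example when the owning view disappears.
- (void)teardownCaptureSession
{
    [[self session] stopRunning];
    [self setSession:nil];
}
```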
Listing 1 shows an example. setupCaptureSession creates a capture session, adds a video input that supplies video frames and an output destination that receives the captured frames, and then starts the flow of data from the input to the output. While the capture session is running, captured video sample buffers are sent to the sample buffer delegate via the captureOutput:didOutputSampleBuffer:fromConnection: method, and each sample buffer (CMSampleBufferRef) is then converted into a UIImage object in imageFromSampleBuffer:.
Listing 1: Using AV Foundation to set up a capture device to record video and save the video frames as UIImage objects.

```
// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
NSError *error = nil;
// Create the session
AVCaptureSession *session = [[AVCaptureSession alloc] init];
// Configure the session to produce lower-resolution video frames, if
// your processing algorithm can cope. We'll specify medium quality.
session.sessionPreset = AVCaptureSessionPresetMedium;
// Find a suitable AVCaptureDevice
AVCaptureDevice *device = [AVCaptureDevice
defaultDeviceWithMediaType:AVMediaTypeVideo];
// Create a device input with the device and add it to the session
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
error:&error];
if (!input) {
// Handle the error appropriately
}
[session addInput:input];
// Create a VideoDataOutput object and add it to the session
AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
[session addOutput:output];
// Configure the output object
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);
// Specify the pixel format
output.videoSettings =
[NSDictionary dictionaryWithObject:
[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
forKey:(id)kCVPixelBufferPixelFormatTypeKey];
// If you want to cap the frame rate, for example at 15 fps,
// set minFrameDuration (this property was deprecated in iOS 5.0)
output.minFrameDuration = CMTimeMake(1, 15);
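// (On iOS 5.0 and later, the replacement is the videoMinFrameDuration
// property on the output's AVCaptureConnection.)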
// Start the session running to start the flow of data
[session startRunning];
// Assign the session to an instance variable
[self setSession:session];
}
// Delegate routine that is called when a sample buffer is written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
{
// Create a UIImage from the sample buffer data
UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
< Add your code that uses the image here >
}
// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
// Get a CMSampleBuffer's Core Video image buffer for the media data
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
// Lock the base address of the pixel buffer
CVPixelBufferLockBaseAddress(imageBuffer, 0);
// Get the base address of the pixel buffer
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
// Get the number of bytes per row for the pixel buffer
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
// Get the width and height of the pixel buffer
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
// Create a device-dependent RGB color space
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
// Create a bitmap graphics context with the sample buffer data
CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
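// (kCVPixelFormatType_32BGRA stores bytes as B,G,R,A, which a
// little-endian 32-bit read sees as alpha-first ARGB, hence these flags.)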
// Create a Quartz image from the pixel data in the bitmap graphics context
CGImageRef quartzImage = CGBitmapContextCreateImage(context);
// Unlock the pixel buffer
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
// Free up the context and color space
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
// Create an image object from the Quartz image
UIImage *image = [UIImage imageWithCGImage:quartzImage];
// Release the Quartz image
CGImageRelease(quartzImage);
return (image);
}
```
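Note that the delegate callback is invoked on the dispatch queue you passed to setSampleBufferDelegate:queue: ("myQueue" above), not on the main thread, so any UIKit work with the resulting image must be dispatched back to the main queue. A minimal sketch, assuming a hypothetical imageView property on the same class:

```
// Inside captureOutput:didOutputSampleBuffer:fromConnection:, after
// obtaining the image; UIKit must only be used from the main thread.
dispatch_async(dispatch_get_main_queue(), ^{
    [[self imageView] setImage:image];
});
```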