AVFoundation Media Capture

1 Media Capture

1.1 Overview of the Media Capture Classes

  • AVCaptureSession: the core class of AVFoundation. It manages the data flow from the physical capture devices, and can be given a session preset that controls the format and quality of the captured data.
  • AVCaptureDevice: the interface to physical devices such as cameras and microphones; it defines a large number of hardware controls, e.g. camera focus, exposure, white balance, and flash.
  • AVCaptureDeviceInput: a wrapper around an AVCaptureDevice; a physical device can only be added to an AVCaptureSession after being wrapped in an instance of this class.
  • AVCaptureOutput: an abstract class whose concrete subclasses, such as AVCaptureStillImageOutput, output the captured media; the ...DataOutput subclasses are used for real-time processing of the video and audio streams.
  • AVCaptureConnection: connects an output to an input (the arrows in a typical session diagram; see the sketch after this list), allowing low-level control of the signal flow, such as disabling specific connections or accessing individual audio tracks on an audio connection.
  • AVCaptureVideoPreviewLayer: renders the video currently being captured; its videoGravity property controls how the image is scaled.
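A minimal sketch of how these classes wire together (authorization checks and error handling omitted; the article's full setup code appears in section 1.2.1):

// A minimal sketch: device -> input -> session -> output, plus a preview layer.
AVCaptureSession *session = [[AVCaptureSession alloc] init];

AVCaptureDevice *camera =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input =
    [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
if ([session canAddInput:input]) {
    [session addInput:input];
}

AVCaptureMovieFileOutput *output = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:output]) {
    [session addOutput:output]; // AVCaptureConnections are created implicitly here
}

AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:session];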

1.2 Image Capture

1.2.1 Initializing the Camera

First, create THPreviewView.h, which is responsible for previewing the captured image and for all of the gesture interaction logic.

#import <AVFoundation/AVFoundation.h>

// Delegate methods for gesture interaction
@protocol THPreviewViewDelegate <NSObject>
- (void)tappedToFocusAtPoint:(CGPoint)point;
- (void)tappedToExposeAtPoint:(CGPoint)point;
- (void)tappedToResetFocusAndExposure;
@end

@interface THPreviewView : UIView
@property (strong, nonatomic) AVCaptureSession *session;
@property (weak, nonatomic) id<THPreviewViewDelegate> delegate;

@property (nonatomic) BOOL tapToFocusEnabled;
@property (nonatomic) BOOL tapToExposeEnabled;
@end

THPreviewView.m

@interface THPreviewView ()
@property (strong, nonatomic) UIView *focusBox;
@property (strong, nonatomic) UIView *exposureBox;
@property (strong, nonatomic) NSTimer *timer;
@property (strong, nonatomic) UITapGestureRecognizer *singleTapRecognizer;
@property (strong, nonatomic) UITapGestureRecognizer *doubleTapRecognizer;
@property (strong, nonatomic) UITapGestureRecognizer *doubleDoubleTapRecognizer;
@end

@implementation THPreviewView

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        [self setupView];
    }
    return self;
}

+ (Class)layerClass {
    return [AVCaptureVideoPreviewLayer class];
}

- (AVCaptureSession*)session {
    return [(AVCaptureVideoPreviewLayer*)self.layer session];
}

- (void)setSession:(AVCaptureSession *)session {
    [(AVCaptureVideoPreviewLayer*)self.layer setSession:session];
}

// Converts a point from screen coordinates (in pixels) to the capture device's coordinate system,
// where, with the home button on the right, the top-left corner is (0,0) and the bottom-right is (1,1)
- (CGPoint)captureDevicePointForPoint:(CGPoint)point {
    AVCaptureVideoPreviewLayer *layer =
    (AVCaptureVideoPreviewLayer *)self.layer;
    return [layer captureDevicePointOfInterestForPoint:point];
}
@end
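The setupView method called from the initializer is not shown in the excerpt above. A minimal sketch of the gesture wiring inside THPreviewView's implementation might look like the following; the handler names (handleSingleTap: and friends) are assumptions of this sketch, and the focus/exposure box animations from the full sample are omitted:

// A minimal sketch of setupView and the tap handlers; belongs inside
// THPreviewView's @implementation.
- (void)setupView {
    self.singleTapRecognizer =
        [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleSingleTap:)];

    self.doubleTapRecognizer =
        [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleDoubleTap:)];
    self.doubleTapRecognizer.numberOfTapsRequired = 2;

    self.doubleDoubleTapRecognizer =
        [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleDoubleDoubleTap:)];
    self.doubleDoubleTapRecognizer.numberOfTapsRequired = 2;
    self.doubleDoubleTapRecognizer.numberOfTouchesRequired = 2;

    [self addGestureRecognizer:self.singleTapRecognizer];
    [self addGestureRecognizer:self.doubleTapRecognizer];
    [self addGestureRecognizer:self.doubleDoubleTapRecognizer];
    // Keep the single tap from firing while a double tap is still possible.
    [self.singleTapRecognizer requireGestureRecognizerToFail:self.doubleTapRecognizer];
}

- (void)handleSingleTap:(UIGestureRecognizer *)recognizer {
    if (self.tapToFocusEnabled) {
        CGPoint point = [recognizer locationInView:self];
        [self.delegate tappedToFocusAtPoint:[self captureDevicePointForPoint:point]];
    }
}

- (void)handleDoubleTap:(UIGestureRecognizer *)recognizer {
    if (self.tapToExposeEnabled) {
        CGPoint point = [recognizer locationInView:self];
        [self.delegate tappedToExposeAtPoint:[self captureDevicePointForPoint:point]];
    }
}

- (void)handleDoubleDoubleTap:(UIGestureRecognizer *)recognizer {
    [self.delegate tappedToResetFocusAndExposure];
}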

Create THCameraController.h as the camera management class, responsible for the concrete camera-related logic.

extern NSString *const THThumbnailCreatedNotification;

// Delegate callbacks invoked when error events occur
@protocol THCameraControllerDelegate <NSObject>
- (void)deviceConfigurationFailedWithError:(NSError *)error;
- (void)mediaCaptureFailedWithError:(NSError *)error;
- (void)assetLibraryWriteFailedWithError:(NSError *)error;
@end

@interface THCameraController : NSObject
@property (weak, nonatomic) id<THCameraControllerDelegate> delegate;
@property (nonatomic, strong, readonly) AVCaptureSession *captureSession;

// Session Configuration: set up and control the capture session
- (BOOL)setupSession:(NSError **)error;
- (void)startSession;
- (void)stopSession;

// Camera Device Support: switch cameras, query and toggle the flash and torch, and query tap-to-focus/tap-to-expose support
- (BOOL)switchCameras;
- (BOOL)canSwitchCameras;
@property (nonatomic, readonly) NSUInteger cameraCount;
@property (nonatomic, readonly) BOOL cameraHasTorch;
@property (nonatomic, readonly) BOOL cameraHasFlash;
@property (nonatomic, readonly) BOOL cameraSupportsTapToFocus;
@property (nonatomic, readonly) BOOL cameraSupportsTapToExpose;
@property (nonatomic) AVCaptureTorchMode torchMode;
@property (nonatomic) AVCaptureFlashMode flashMode;

// Tap to * Methods: focus and exposure at a point
- (void)focusAtPoint:(CGPoint)point;
- (void)exposeAtPoint:(CGPoint)point;
- (void)resetFocusAndExposureModes;

/** Media Capture Methods: still images and video **/
// Still Image Capture
- (void)captureStillImage;

// Video Recording
- (void)startRecording;
- (void)stopRecording;
- (BOOL)isRecording;
- (CMTime)recordedDuration;
@end

THCameraController.m implements the image- and audio-capture logic.

- (BOOL)setupSession:(NSError **)error {
    self.captureSession = [[AVCaptureSession alloc] init];
    self.captureSession.sessionPreset = AVCaptureSessionPresetHigh;
    
    // Set up the default camera device input
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:error];
    if (videoInput) {
        if ([self.captureSession canAddInput:videoInput]) {
            [self.captureSession addInput:videoInput];
            self.activeVideoInput = videoInput;
        }
    } else {
        return NO;
    }
    
    // Set up the default microphone device input
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:error];
    if (audioInput) {
        if ([self.captureSession canAddInput:audioInput]) {
            [self.captureSession addInput:audioInput];
        }
    } else {
        return NO;
    }
    
    // Set up the still image output
    self.imageOutput = [[AVCaptureStillImageOutput alloc] init];
    self.imageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
    if ([self.captureSession canAddOutput:self.imageOutput]) {
        [self.captureSession addOutput:self.imageOutput];
    }
    
    // Set up the movie file output
    self.movieOutput = [[AVCaptureMovieFileOutput alloc] init];
    if ([self.captureSession canAddOutput:self.movieOutput]) {
        [self.captureSession addOutput:self.movieOutput];
    }
    self.videoQueue = dispatch_queue_create("com.tapharmonic.VideoQueue", DISPATCH_QUEUE_SERIAL);
    return YES;
}

// Starting and stopping the session are blocking calls, so they are dispatched asynchronously onto a serial queue
- (void)startSession {
    if (![self.captureSession isRunning]) {
        dispatch_async(self.videoQueue, ^{
            [self.captureSession startRunning];
        });
    }
}

- (void)stopSession {
    if ([self.captureSession isRunning]) {
        dispatch_async(self.videoQueue, ^{
            [self.captureSession stopRunning];
        });
    }
}

Create THViewController as the camera view controller, which manages the preview view (THPreviewView) and the camera controller (THCameraController).

- (void)viewDidLoad {
    [super viewDidLoad];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(updateThumbnail:)
                                                 name:THThumbnailCreatedNotification
                                               object:nil];
    self.cameraMode = THCameraModeVideo;
    self.cameraController = [[THCameraController alloc] init];
    
    NSError *error;
    if ([self.cameraController setupSession:&error]) {
        [self.previewView setSession:self.cameraController.captureSession];
        self.previewView.delegate = self;
        [self.cameraController startSession];
    } else {
        NSLog(@"Error: %@", [error localizedDescription]);
    }
    
    self.previewView.tapToFocusEnabled = self.cameraController.cameraSupportsTapToFocus;
    self.previewView.tapToExposeEnabled = self.cameraController.cameraSupportsTapToExpose;  
}

Finally, add the usage-description keys for the camera and microphone (NSCameraUsageDescription and NSMicrophoneUsageDescription) to the project's Info.plist; camera initialization is now complete.
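The usage-description keys only declare why access is needed; the session will not deliver any data until the user actually grants access. A minimal sketch of checking and requesting camera authorization before setting up the session (where exactly to call this is an app-level decision):

// A minimal sketch; the same pattern applies to AVMediaTypeAudio for the microphone.
switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) {
    case AVAuthorizationStatusAuthorized:
        // Already authorized: safe to call setupSession: and startSession.
        break;
    case AVAuthorizationStatusNotDetermined:
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                                 completionHandler:^(BOOL granted) {
            // Called on an arbitrary queue; hop to the main queue before touching UI.
        }];
        break;
    default:
        // Denied or restricted: direct the user to Settings.
        break;
}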

1.2.2 Switching Cameras

Utility methods for switching cameras

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}

- (AVCaptureDevice *)activeCamera {
    return self.activeVideoInput.device;
}

- (AVCaptureDevice *)inactiveCamera {
    AVCaptureDevice *device = nil;
    if (self.cameraCount > 1) {
        if ([self activeCamera].position == AVCaptureDevicePositionBack) {
            device = [self cameraWithPosition:AVCaptureDevicePositionFront];
        } else {
            device = [self cameraWithPosition:AVCaptureDevicePositionBack];
        }
    }
    return device;
}

- (BOOL)canSwitchCameras {
    return self.cameraCount > 1;
}

- (NSUInteger)cameraCount {
    return [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count];
}

The core camera-switching method

- (BOOL)switchCameras {
    if (![self canSwitchCameras]) {
        return NO;
    }
    
    NSError *error;
    AVCaptureDevice *videoDevice = [self inactiveCamera];
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    
    if (videoInput) {
        [self.captureSession beginConfiguration];
        [self.captureSession removeInput:self.activeVideoInput];
        if ([self.captureSession canAddInput:videoInput]) {
            [self.captureSession addInput:videoInput];
            self.activeVideoInput = videoInput;
        } else {
            [self.captureSession addInput:self.activeVideoInput];
        }
        [self.captureSession commitConfiguration];
    } else {
        [self.delegate deviceConfigurationFailedWithError:error];
        return NO;
    }
    return YES;
}
1.2.3 Configuring the Capture Device

AVCaptureDevice defines many methods for controlling the camera's focus, exposure, flash, and white balance. Each configuration follows the same four steps: 1) query whether the configuration is supported, 2) lock the device for configuration, 3) perform the configuration, 4) unlock the device.
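As a generic sketch (using focus mode purely as the example), every configuration in the sections below takes this shape:

// Generic device-configuration skeleton: query, lock, configure, unlock.
AVCaptureDevice *device = [self activeCamera];
if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {    // 1) query support
    NSError *error;
    if ([device lockForConfiguration:&error]) {                     // 2) lock
        device.focusMode = AVCaptureFocusModeAutoFocus;             // 3) configure
        [device unlockForConfiguration];                            // 4) unlock
    } else {
        [self.delegate deviceConfigurationFailedWithError:error];
    }
}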

1.2.4 Adjusting Focus and Exposure

Most iOS devices support point-of-interest based focus and exposure, and by default both continuously auto-adjust on the center of the screen. In practice we often need the device to lock focus and exposure on a point of interest other than the center of the view.
Locking focus

#pragma mark - Focus Methods
- (BOOL)cameraSupportsTapToFocus {
    return [[self activeCamera] isFocusPointOfInterestSupported];
}

- (void)focusAtPoint:(CGPoint)point {
    AVCaptureDevice *device = [self activeCamera];
    if (device.isFocusPointOfInterestSupported && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            // The point has already been converted from screen coordinates to device coordinates in the preview view
            device.focusPointOfInterest = point;
            device.focusMode = AVCaptureFocusModeAutoFocus;
            [device unlockForConfiguration];
        } else {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}

Locking exposure

#pragma mark - Exposure Methods
- (BOOL)cameraSupportsTapToExpose {
    return [[self activeCamera] isExposurePointOfInterestSupported];
}

static const NSString *THCameraAdjustingExposureContext;
- (void)exposeAtPoint:(CGPoint)point {
    // Continuous auto-exposure is used here, with KVO observing the device so the exposure level can be locked once it settles;
    // the continuous mode gives finer control over the exposure level. Continuous auto-focus offers similar fine control for focus,
    // but a single auto-focus pass satisfies most needs.
    AVCaptureDevice *device = [self activeCamera];
    if (device.isExposurePointOfInterestSupported && [device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.exposurePointOfInterest = point;
            device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
            
            if ([device isExposureModeSupported:AVCaptureExposureModeLocked]) {
                [device addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:&THCameraAdjustingExposureContext];
            }
            [device unlockForConfiguration];
        } else {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == &THCameraAdjustingExposureContext) {
        AVCaptureDevice *device = (AVCaptureDevice *)object;
        // AVCaptureExposureModeContinuousAutoExposure keeps adjusting the exposure at the point of interest, so
        // adjustingExposure changes repeatedly; the observer is removed only once the adjustment has finished. It was
        // only added when AVCaptureExposureModeLocked is supported, so removing it under the same condition removes it cleanly.
        if (!device.isAdjustingExposure && [device isExposureModeSupported:AVCaptureExposureModeLocked]) {
            [object removeObserver:self forKeyPath:@"adjustingExposure" context:&THCameraAdjustingExposureContext];
            
            dispatch_async(dispatch_get_main_queue(), ^{
                NSError *error;
                if ([device lockForConfiguration:&error]) {
                    device.exposureMode = AVCaptureExposureModeLocked;
                    [device unlockForConfiguration];
                } else {
                    [self.delegate deviceConfigurationFailedWithError:error];
                }
            });
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}

Resetting continuous auto-focus and auto-exposure at the screen center

- (void)resetFocusAndExposureModes {
    AVCaptureDevice *device = [self activeCamera];
    AVCaptureFocusMode focusMode = AVCaptureFocusModeContinuousAutoFocus;
    BOOL canResetFocus = [device isFocusPointOfInterestSupported] && [device isFocusModeSupported:focusMode];
    AVCaptureExposureMode exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    BOOL canResetExposure = [device isExposurePointOfInterestSupported] && [device isExposureModeSupported:exposureMode];
    
    CGPoint centerPoint = CGPointMake(0.5f, 0.5f);
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        if (canResetFocus) {
            device.focusMode = focusMode;
            device.focusPointOfInterest = centerPoint;
        }
        
        if (canResetExposure) {
            device.exposureMode = exposureMode;
            device.exposurePointOfInterest = centerPoint;
        }
        [device unlockForConfiguration];
    } else {
        [self.delegate deviceConfigurationFailedWithError:error];
    }
}
1.2.5 Flash and Torch

Apple uses the rear LED both as the flash in photo mode and as the torch in video mode. The torch can also be turned on in photo mode by setting the torch mode.

#pragma mark - Flash and Torch Modes
- (BOOL)cameraHasFlash {
    return [[self activeCamera] hasFlash];
}

- (AVCaptureFlashMode)flashMode {
    return [[self activeCamera] flashMode];
}

- (void)setFlashMode:(AVCaptureFlashMode)flashMode {
    AVCaptureDevice *device = [self activeCamera];
    if ([device isFlashModeSupported:flashMode]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.flashMode = flashMode;
            [device unlockForConfiguration];
        } else {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}

- (BOOL)cameraHasTorch {
    return [[self activeCamera] hasTorch];
}

- (AVCaptureTorchMode)torchMode {
    return [[self activeCamera] torchMode];
}

- (void)setTorchMode:(AVCaptureTorchMode)torchMode {
    AVCaptureDevice *device = [self activeCamera];
    if ([device isTorchModeSupported:torchMode]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.torchMode = torchMode;
            [device unlockForConfiguration];
        } else {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}
1.2.6 Capturing Still Images

When a session is created and capture inputs and outputs are added, AVFoundation automatically establishes the connections between them; capturing either photos or video starts with obtaining the right AVCaptureConnection.

Processing photos involves CMSampleBuffer from the Core Media framework. Note that this is a Core Foundation type, which ARC does not manage: Core Foundation objects you create yourself must be released manually, but a sample buffer handed to a system-provided block does not need to be; presumably the system disposes of it after the block returns.

When creating the still image output, its outputSettings can specify the output image format, so the corresponding stream is compressed into binary data of that format.

Writing photo and video assets into the system photo library requires the Photos framework.

#pragma mark - Image Capture Methods
- (void)captureStillImage {
    AVCaptureConnection *connection = [self.imageOutput connectionWithMediaType:AVMediaTypeVideo];
    if (connection.isVideoOrientationSupported) {
        connection.videoOrientation = [self currentVideoOrientation];
    }
    
    id handler = ^(CMSampleBufferRef sampleBuffer, NSError *error) {
        if (sampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
            [self writeImageToPhotosLibrary:image];
        } else {
            NSLog(@"NULL sampleBuffer: %@", [error localizedDescription]);
        }
    };
    // Capture the still image
    [self.imageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:handler];
}

- (AVCaptureVideoOrientation)currentVideoOrientation {
    AVCaptureVideoOrientation orientation;
    switch ([UIDevice currentDevice].orientation) {
        // Left and right are reversed between the camera and the device orientation
        case UIDeviceOrientationLandscapeRight:
            orientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        case UIDeviceOrientationLandscapeLeft:
            orientation = AVCaptureVideoOrientationLandscapeRight;
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            orientation = AVCaptureVideoOrientationPortraitUpsideDown;
            break;
        case UIDeviceOrientationPortrait:
        default:
            orientation = AVCaptureVideoOrientationPortrait;
            break;
    }
    return orientation;
}

- (void)writeImageToPhotosLibrary:(UIImage *)image {
    NSError *error = nil;
    __block PHObjectPlaceholder *createdAsset = nil;
    [[PHPhotoLibrary sharedPhotoLibrary] performChangesAndWait:^{
        createdAsset = [PHAssetCreationRequest creationRequestForAssetFromImage:image].placeholderForCreatedAsset;
    } error:&error];
    if (error || !createdAsset) {
        NSLog(@"Error: %@", [error localizedDescription]);
    } else {
        [self postThumbnailNotification:image];
    }
}

- (void)postThumbnailNotification:(UIImage *)image {
    NSNotificationCenter *nc = [NSNotificationCenter defaultCenter];
    [nc postNotificationName:THThumbnailCreatedNotification object:image];
}
1.2.7 Recording Video Files

Producing a movie file has two parts: recording the data and packaging the file. While recording, to guard against interruptions such as an incoming phone call, AVCaptureFileOutput captures in segments: a minimal header is written when the file is first created, fragments are appended at a regular interval as recording proceeds, and the complete header is created at the end. If an interruption occurs, the movie is saved up to the end of the last fully written fragment. The default fragment length is 10 seconds and can be changed via movieFragmentInterval. When the file is packaged, the header is placed before the video data.

AVCaptureFileOutput can attach metadata to the movie file about to be written. It also provides many practical features, such as limiting the maximum recording duration, recording up to a specific file size, and reserving a minimum amount of free disk space. Through its outputSettings you can configure the codec, bit rate, chroma subsampling, and a host of other properties; see its header for details. Usually outputSettings is left unset and the system chooses per-connection defaults such as the H.264 codec; the active configuration can be inspected through its instance methods.

However flexible its configuration, AVCaptureFileOutput always writes Apple's QuickTime container format, with the .mov extension. If another format is needed, the original movie must be transcoded after the delegate method reports that recording has finished and the file has been packaged.
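For example, the limits mentioned above might be configured like this (a sketch; the values are arbitrary):

// A sketch of the optional AVCaptureMovieFileOutput limits.
AVCaptureMovieFileOutput *output = self.movieOutput;
output.movieFragmentInterval = CMTimeMake(5, 1);      // write a movie fragment every 5 seconds
output.maxRecordedDuration = CMTimeMake(600, 1);      // stop recording after 10 minutes...
output.maxRecordedFileSize = 500 * 1024 * 1024;       // ...or after 500 MB
output.minFreeDiskSpaceLimit = 100 * 1024 * 1024;     // keep at least 100 MB of disk free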

#pragma mark - Video Capture Methods
- (BOOL)isRecording {
    return self.movieOutput.isRecording;
}

- (void)startRecording {
    if (self.isRecording) {
        return;
    }
    AVCaptureConnection *videoConnection = [self.movieOutput connectionWithMediaType:AVMediaTypeVideo];
    // Setting the video orientation does not physically rotate the pixels; it only records a corresponding transform matrix in the QuickTime file
    if ([videoConnection isVideoOrientationSupported]) {
        videoConnection.videoOrientation = [self currentVideoOrientation];
    }
    // Stabilization is not visible in the live preview; it only affects the final recorded video
    if ([videoConnection isVideoStabilizationSupported]) {
        videoConnection.enablesVideoStabilizationWhenAvailable = YES;
    }
    
    // By default the camera refocuses quickly on the center of the frame, which produces a pulsing effect when
    // rapidly switching between near and far scenes; smooth autofocus slows focusing down so fast scene changes look more natural
    AVCaptureDevice *device = [self activeCamera];
    if (device.isSmoothAutoFocusSupported) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.smoothAutoFocusEnabled = YES;
            [device unlockForConfiguration];
        } else {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
    
    self.outputURL = [self uniqueURL];
    [self.movieOutput startRecordingToOutputFileURL:self.outputURL recordingDelegate:self];
}

- (NSURL *)uniqueURL {
    NSFileManager *fileManager = [NSFileManager defaultManager];
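    // temporaryDirectoryWithTemplateString: is a category helper from the sample
    // project, not standard NSFileManager API; a sketch of it follows this listing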
    NSString *dirPath = [fileManager temporaryDirectoryWithTemplateString:@"kamera.XXXXXX"];
    if (dirPath) {
        NSString *filePath = [dirPath stringByAppendingPathComponent:@"kamera_movie.mov"];
        return [NSURL fileURLWithPath:filePath];
    }
    return nil;
}

- (void)stopRecording {
    if ([self isRecording]) {
        [self.movieOutput stopRecording];
    }
}

- (CMTime)recordedDuration {
    return self.movieOutput.recordedDuration;
}

#pragma mark - AVCaptureFileOutputRecordingDelegate
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error {
    if (error) {
        [self.delegate mediaCaptureFailedWithError:error];
    } else {
        [self writeVideoToPhotosLibrary:[self.outputURL copy]];
    }
    self.outputURL = nil;
}

- (void)writeVideoToPhotosLibrary:(NSURL *)videoURL {
    NSError *error = nil;
    __block PHObjectPlaceholder *createdAsset = nil;
    [[PHPhotoLibrary sharedPhotoLibrary] performChangesAndWait:^{
        createdAsset = [PHAssetCreationRequest creationRequestForAssetFromVideoAtFileURL:videoURL].placeholderForCreatedAsset;
    } error:&error];
    if (error || !createdAsset) {
        [self.delegate assetLibraryWriteFailedWithError:error];
    } else {
        [self generateThumbnailForVideoAtURL:videoURL];
    }
}

- (void)generateThumbnailForVideoAtURL:(NSURL *)videoURL {
    dispatch_async(self.videoQueue, ^{
        AVAsset *asset = [AVAsset assetWithURL:videoURL];
        AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
        imageGenerator.maximumSize = CGSizeMake(100.0f, 0.0f);
        // Apply the track's preferred transform so the thumbnail accounts for the video's orientation
        imageGenerator.appliesPreferredTrackTransform = YES;
        
        CGImageRef imageRef = [imageGenerator copyCGImageAtTime:kCMTimeZero actualTime:NULL error:nil];
        UIImage *image = [UIImage imageWithCGImage:imageRef];
        CGImageRelease(imageRef);
        
        dispatch_async(dispatch_get_main_queue(), ^{
            [self postThumbnailNotification:image];
        });
    });
}
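As noted above, temporaryDirectoryWithTemplateString: is not standard NSFileManager API but a category helper from the sample project. A minimal sketch of such a helper, built on mkdtemp(3) (declared in unistd.h; strdup is in string.h), might look like this:

@implementation NSFileManager (THAdditions)
- (NSString *)temporaryDirectoryWithTemplateString:(NSString *)templateString {
    // Build e.g. <tmp>/kamera.XXXXXX and let mkdtemp() replace the X's
    // with a unique suffix, creating the directory in the process.
    NSString *mkdTemplate =
        [NSTemporaryDirectory() stringByAppendingPathComponent:templateString];
    char *buffer = strdup([mkdTemplate fileSystemRepresentation]); // mkdtemp mutates its argument
    NSString *directoryPath = nil;
    if (mkdtemp(buffer)) {
        directoryPath = [self stringWithFileSystemRepresentation:buffer
                                                           length:strlen(buffer)];
    }
    free(buffer);
    return directoryPath;
}
@end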

2 Advanced Capture Features

2.1 Video Zoom

AVCaptureDevice provides a videoZoomFactor property that sets the zoom level, and its activeFormat property returns an AVCaptureDeviceFormat instance whose videoMaxZoomFactor gives the maximum zoom level. The device zooms by center-cropping the image captured by the camera sensor. AVCaptureDeviceFormat's videoZoomFactorUpscaleThreshold defines the zoom factor beyond which the cropped image is upscaled.

const CGFloat THZoomRate = 1.0f;

// KVO Contexts
static const NSString *THRampingVideoZoomContext;
static const NSString *THRampingVideoZoomFactorContext;

@implementation THCameraController
- (void)dealloc {
    [self.activeCamera removeObserver:self forKeyPath:@"videoZoomFactor"];
    [self.activeCamera removeObserver:self forKeyPath:@"rampingVideoZoom"];
}

- (BOOL)setupSessionInputs:(NSError **)error {
    BOOL success = [super setupSessionInputs:error];
    if (success) {
        [self.activeCamera addObserver:self forKeyPath:@"videoZoomFactor" options:NSKeyValueObservingOptionNew context:&THRampingVideoZoomFactorContext];
        [self.activeCamera addObserver:self forKeyPath:@"rampingVideoZoom" options:NSKeyValueObservingOptionNew context:&THRampingVideoZoomContext];
    }
    return success;
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == &THRampingVideoZoomContext) {
        [self updateZoomingDelegate];
    } else if (context == &THRampingVideoZoomFactorContext) {
        if (self.activeCamera.isRampingVideoZoom) {
            [self updateZoomingDelegate];
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}

- (void)updateZoomingDelegate {
    CGFloat curZoomFactor = self.activeCamera.videoZoomFactor;
    CGFloat maxZoomFactor = [self maxZoomFactor];
    CGFloat value = log(curZoomFactor)/log(maxZoomFactor);
    [self.zoomingDelegate rampedZoomToValue:value];
}

- (BOOL)cameraSupportsZoom {
    return self.activeCamera.activeFormat.videoMaxZoomFactor > 1.0f;
}

- (CGFloat)maxZoomFactor {
    // 4.0 is arbitrary; any other maximum zoom level could be defined
    return MIN(self.activeCamera.activeFormat.videoMaxZoomFactor, 4.0f);
}

// One-shot zoom, called when the zoom slider is dragged to a given value
- (void)setZoomValue:(CGFloat)zoomValue {
    if (!self.activeCamera.isRampingVideoZoom) {
        NSError *error;
        if ([self.activeCamera lockForConfiguration:&error]) {
            CGFloat zoomFactor = pow([self maxZoomFactor], zoomValue);
            self.activeCamera.videoZoomFactor = zoomFactor;
            [self.activeCamera unlockForConfiguration];
        } else {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}

// Gradual zoom, called while the joystick is pulled
- (void)rampZoomToValue:(CGFloat)zoomValue {
    NSError *error;
    if ([self.activeCamera lockForConfiguration:&error]) {
        CGFloat zoomFactor = pow([self maxZoomFactor], zoomValue);
        // With rate = 1, the zoom factor doubles every second
        [self.activeCamera rampToVideoZoomFactor:zoomFactor withRate:THZoomRate];
        [self.activeCamera unlockForConfiguration];
    } else {
        [self.delegate deviceConfigurationFailedWithError:error];
    }
}

// Cancel the ramp and hold the current value, called when the joystick is released
- (void)cancelZoom {
    NSError *error;
    if ([self.activeCamera lockForConfiguration:&error]) {
        [self.activeCamera cancelVideoZoomRamp];
        [self.activeCamera unlockForConfiguration];
    } else {
        [self.delegate deviceConfigurationFailedWithError:error];
    }
}

2.2 Face Detection

The built-in iPhone camera app detects faces whenever they appear in the frame and focuses on the center of the face. In AVFoundation, AVCaptureMetadataOutput implements face detection. Its results are an array of AVMetadataFaceObject instances, each representing one detected face and containing its position (in device coordinates), its roll angle (the tilt between the face and the shoulders, in degrees), and its yaw angle (the face's rotation around the y-axis, in degrees).

Initialize the face metadata output in the camera controller

- (BOOL)setupSessionOutputs:(NSError **)error {
    self.metadataOutput = [[AVCaptureMetadataOutput alloc] init];

    if ([self.captureSession canAddOutput:self.metadataOutput]) {
        [self.captureSession addOutput:self.metadataOutput];

        NSArray *metadataObjectTypes = @[AVMetadataObjectTypeFace];
        self.metadataOutput.metadataObjectTypes = metadataObjectTypes;

        dispatch_queue_t mainQueue = dispatch_get_main_queue();
        [self.metadataOutput setMetadataObjectsDelegate:self
                                                  queue:mainQueue];
        return YES;
    } else {
        if (error) {
            NSDictionary *userInfo = @{NSLocalizedDescriptionKey:
                                           @"Failed to add metadata output."};
            *error = [NSError errorWithDomain:THCameraErrorDomain
                                         code:THCameraErrorFailedToAddOutput
                                     userInfo:userInfo];
        }
        return NO;
    }
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection {
    for (AVMetadataFaceObject *face in metadataObjects) {
        NSLog(@"Face detected with ID: %li",(long)face.faceID);
        NSLog(@"Face bounds: %@", NSStringFromCGRect(face.bounds));
    }
    [self.faceDetectionDelegate didDetectFaces:metadataObjects];
}

Visualizing the face data in the preview view

@interface THPreviewView ()
@property (strong, nonatomic) CALayer *overlayLayer;
@property (strong, nonatomic) NSMutableDictionary *faceLayers;
@property (nonatomic, readonly) AVCaptureVideoPreviewLayer *previewLayer;
@end

@implementation THPreviewView

+ (Class)layerClass {
    return [AVCaptureVideoPreviewLayer class];
}

- (id)initWithFrame:(CGRect)frame {
    if (self = [super initWithFrame:frame]) {
        [self setupView];
    }
    return self;
}

- (void)setupView {
    self.faceLayers = [NSMutableDictionary dictionary];
    // The video preview layer
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    // The layer that hosts the face layers
    self.overlayLayer = [CALayer layer];
    self.overlayLayer.frame = self.bounds;
    // Give all sublayers a perspective, with the eye 1000 points from the image along the z-axis
    self.overlayLayer.sublayerTransform = CATransform3DMakePerspective(1000);
    [self.previewLayer addSublayer:self.overlayLayer];
}

- (AVCaptureSession*)session {
    return self.previewLayer.session;
}

- (void)setSession:(AVCaptureSession *)session {                            
    self.previewLayer.session = session;
}

- (AVCaptureVideoPreviewLayer *)previewLayer {
    return (AVCaptureVideoPreviewLayer *)self.layer;
}

- (void)didDetectFaces:(NSArray *)faces {
    NSArray *transformedFaces = [self transformedFacesFromFaces:faces];
    NSMutableArray *lostFaces = [self.faceLayers.allKeys mutableCopy];

    // Update the faces that are still being detected
    for (AVMetadataFaceObject *face in transformedFaces) {
        NSNumber *faceID = @(face.faceID);
        [lostFaces removeObject:faceID];

        CALayer *layer = [self.faceLayers objectForKey:faceID];
        if (!layer) {
            layer = [self makeFaceLayer];
            [self.overlayLayer addSublayer:layer];
            self.faceLayers[faceID] = layer;
        }
        // Reset the face layer's transform
        layer.transform = CATransform3DIdentity;
        // Position the face layer
        layer.frame = face.bounds;
        // Apply the face layer's roll angle (the head's tilt toward the shoulders)
        if (face.hasRollAngle) {
            CATransform3D t = [self transformForRollAngle:face.rollAngle];
            // Multiple CATransform3D rotations are combined by multiplying the individual rotation matrices
            layer.transform = CATransform3DConcat(layer.transform, t);
        }
        // Apply the face layer's yaw angle (the face's rotation relative to the y-axis)
        if (face.hasYawAngle) {
            CATransform3D t = [self transformForYawAngle:face.yawAngle];
            layer.transform = CATransform3DConcat(layer.transform, t);
        }
    }

    // Remove faces that have been lost, both from the dictionary and from the layer tree
    for (NSNumber *faceID in lostFaces) {
        CALayer *layer = [self.faceLayers objectForKey:faceID];
        [layer removeFromSuperlayer];
        [self.faceLayers removeObjectForKey:faceID];
    }
}

// Convert the face objects from device coordinates to previewLayer coordinates
- (NSArray *)transformedFacesFromFaces:(NSArray *)faces {
    NSMutableArray *transformedFaces = [NSMutableArray array];
    for (AVMetadataObject *face in faces) {
        AVMetadataObject *transformedFace =
            [self.previewLayer transformedMetadataObjectForMetadataObject:face];
        [transformedFaces addObject:transformedFace];
    }
    return transformedFaces;
}

- (CALayer *)makeFaceLayer {
    CALayer *layer = [CALayer layer];
    layer.borderWidth = 5.0f;
    layer.borderColor =
        [UIColor colorWithRed:0.188 green:0.517 blue:0.877 alpha:1.000].CGColor;
    return layer;
}
@end

Handling face roll and yaw in the preview view

// Rotation around the z-axis
- (CATransform3D)transformForRollAngle:(CGFloat)rollAngleInDegrees {
    CGFloat rollAngleInRadians = THDegreesToRadians(rollAngleInDegrees);
    return CATransform3DMakeRotation(rollAngleInRadians, 0.0f, 0.0f, 1.0f);
}

// Rotation around the y-axis
- (CATransform3D)transformForYawAngle:(CGFloat)yawAngleInDegrees {
    // First apply the rotation around the y-axis caused by the yaw angle
    CGFloat yawAngleInRadians = THDegreesToRadians(yawAngleInDegrees);
    CATransform3D yawTransform =
        CATransform3DMakeRotation(yawAngleInRadians, 0.0f, -1.0f, 0.0f);
    // Because this example fixes the interface to portrait, the face's angle relative to the camera changes when the device is rotated while the screen does not, so the rotation matrix must be corrected for the current device orientation.
    return CATransform3DConcat(yawTransform, [self orientationTransform]);
}

// Returns the view rotation matrix required for the current device orientation. Although this example only applies it for the yaw angle, it also affects the roll angle, so ideally it should be applied once after both angles have been handled
- (CATransform3D)orientationTransform {
    CGFloat angle = 0.0;
    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationPortraitUpsideDown:
            angle = M_PI;
            break;
        case UIDeviceOrientationLandscapeRight:
            angle = -M_PI / 2.0f;
            break;
        case UIDeviceOrientationLandscapeLeft:
            angle = M_PI / 2.0f;
            break;
        default: //UIDeviceOrientationPortrait
            angle = 0.0;
            break;
    }
    return CATransform3DMakeRotation(angle, 0.0f, 0.0f, 1.0f);
}

// The roll and yaw angles in AVMetadataFaceObject are in degrees and must be converted to radians here
static CGFloat THDegreesToRadians(CGFloat degrees) {
    return degrees * M_PI / 180;
}

// A function that builds a CATransform3D with perspective
static CATransform3D CATransform3DMakePerspective(CGFloat eyePosition) {
    // CATransform3D is a 4x4 matrix; the first three columns map a coordinate to its new x, y, z values.
    // The first three rows of the fourth column are the perspective factors for x, y, and z; they default to 0, leaving the image
    // unchanged. Either sign is valid, and larger magnitudes distort the image more. The value can be thought of as the viewpoint
    // and is usually obtained as -1.0 / distance, where distance is the eye's straight-line distance from the 3D model along an axis;
    // it follows the near-large/far-small rule: the closer the distance, the more extreme the stretching along that axis, and at
    // infinite distance the image is not stretched at all. The fourth element of the fourth column defaults to 1 and carries no meaning here.
    CATransform3D transform = CATransform3DIdentity;
    transform.m34 = -1.0 / eyePosition;
    return transform;
}

Combining the above with the Core Animation and Quartz frameworks, dynamic elements such as hats, glasses, and mustaches can be attached to detected faces. Apple's SquareCam sample on the Apple Developer Connection site covers this in detail.

2.3 Recognizing Machine-Readable Codes (QR Codes, etc.)

AVFoundation can recognize machine-readable codes. These fall into two broad categories, barcodes and 2D codes, each with many subtypes; the types AVFoundation supports are shown in the figures below.

[Figure: barcode types supported by AVFoundation]
[Figure: 2D code types supported by AVFoundation]

Machine-readable codes are recognized with AVCaptureMetadataOutput.
Initialize the camera and add the code metadata output

@implementation THCameraController
- (NSString *)sessionPreset {
    // Use a smaller capture resolution for barcode and QR recognition to improve efficiency
    return AVCaptureSessionPreset640x480;
}

// Although the system supports autofocus at any distance, restricting it to the near range improves the recognition success rate
- (BOOL)setupSessionInputs:(NSError *__autoreleasing *)error {
    BOOL success = [super setupSessionInputs:error];
    if (success) {
        if (self.activeCamera.autoFocusRangeRestrictionSupported) {
            if ([self.activeCamera lockForConfiguration:error]) {
                self.activeCamera.autoFocusRangeRestriction = AVCaptureAutoFocusRangeRestrictionNear;
                [self.activeCamera unlockForConfiguration];
            }
        }
    }
    return success;
}

- (BOOL)setupSessionOutputs:(NSError **)error {
    self.metadataOutput = [[AVCaptureMetadataOutput alloc] init];
    if ([self.captureSession canAddOutput:self.metadataOutput]) {
        [self.captureSession addOutput:self.metadataOutput];
        [self.metadataOutput setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
        NSArray *types = @[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeAztecCode];
        self.metadataOutput.metadataObjectTypes = types;
    } else {
        if (error) { // guard against callers that pass NULL
            NSDictionary *userInfo = @{NSLocalizedDescriptionKey : @"Failed to add metadata output"};
            *error = [NSError errorWithDomain:THCameraErrorDomain code:THCameraErrorFailedToAddOutput userInfo:userInfo];
        }
        return NO;
    }
    return YES;
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection {
    [self.codeDetectionDelegate didDetectCodes:metadataObjects];
}
@end

Next, process the detected metadata array. It consists of AVMetadataMachineReadableCodeObject elements, each representing one detected code, with three important properties: stringValue holds the decoded string, bounds defines the code's normalized rectangular bounds in the view, and corners defines the code's actual corner points in device coordinates, from which a three-dimensional outline can be drawn.

Initialize the preview view, which is responsible for rendering

@implementation THPreviewView
+ (Class)layerClass {
    return [AVCaptureVideoPreviewLayer class];
}

- (id)initWithFrame:(CGRect)frame {
    if (self = [super initWithFrame:frame]) {
        [self setupView];
    }
    return self;
}

- (void)setupView {
    _codeLayers = [NSMutableDictionary dictionaryWithCapacity:5];
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
}

- (AVCaptureSession*)session {
    return self.previewLayer.session;
}

- (void)setSession:(AVCaptureSession *)session {
    self.previewLayer.session = session;
}

- (AVCaptureVideoPreviewLayer *)previewLayer {
    return (AVCaptureVideoPreviewLayer *)self.layer;
}

- (void)didDetectCodes:(NSArray *)codes {
    NSArray *transformedCodes = [self transformedCodesFromCodes:codes];
    NSMutableArray *lostCodes = [self.codeLayers.allKeys mutableCopy];
    // Update the codes that are still being tracked (real apps usually close the camera after a single successful scan)
    for (AVMetadataMachineReadableCodeObject *code in transformedCodes) {
        // The decoded string is used as the unique key here, so if two identical codes are visible the later one overwrites the earlier result; to show both, the dictionary's key scheme must be redesigned
        NSString *stringValue = code.stringValue;
        if (stringValue) {
            [lostCodes removeObject:stringValue];
        } else {
            continue;
        }
        
        NSArray *layers = self.codeLayers[stringValue];
        if (!layers) {
            layers = @[[self makeBoundsLayer], [self makeCornersLayer]];
            self.codeLayers[stringValue] = layers;
            [self.previewLayer addSublayer:layers[0]];
            [self.previewLayer addSublayer:layers[1]];
        }
        
        // bounds is the code's normalized rectangular outline; corners holds the pattern's actual corner points, and a layer built from them has a 3D look
        CAShapeLayer *boundsLayer = layers[0];
        boundsLayer.path = [self bezierPathForBounds:code.bounds].CGPath;
        CAShapeLayer *cornerLayer = layers[1];
        cornerLayer.path = [self bezierPathForCorners:code.corners].CGPath;
    }
    
    // Remove codes that are no longer being tracked, along with their layers
    for (NSString *stringValue in lostCodes) {
        for (CALayer *layer in self.codeLayers[stringValue]) {
            [layer removeFromSuperlayer];
        }
        [self.codeLayers removeObjectForKey:stringValue];
    }
}
@end

Using the detected code data to draw outlines and similar overlays

// Convert the code objects from device coordinates to view coordinates
- (NSArray *)transformedCodesFromCodes:(NSArray *)codes {
    NSMutableArray *transformedCodes = [NSMutableArray arrayWithCapacity:5];
    for (AVMetadataObject *code in codes) {
        AVMetadataObject *transformedCode = [self.previewLayer transformedMetadataObjectForMetadataObject:code];
        [transformedCodes addObject:transformedCode];
    }
    return transformedCodes.copy;
}

- (UIBezierPath *)bezierPathForBounds:(CGRect)bounds {
    return [UIBezierPath bezierPathWithRect:bounds];
}

- (CAShapeLayer *)makeBoundsLayer {
    CAShapeLayer *shapeLayer = [CAShapeLayer layer];
    shapeLayer.strokeColor = [UIColor colorWithRed:0.95f green:0.75f blue:0.06f alpha:1.0f].CGColor;
    shapeLayer.fillColor = nil;
    shapeLayer.lineWidth = 4.0f;
    return shapeLayer;
}

- (CAShapeLayer *)makeCornersLayer {
    CAShapeLayer *shapeLayer = [CAShapeLayer layer];
    shapeLayer.strokeColor = [UIColor colorWithRed:0.12 green:0.67 blue:0.42f alpha:1.0f].CGColor;
    shapeLayer.fillColor = [UIColor colorWithRed:0.19f green:0.75f blue:0.48f alpha:1.0f].CGColor;
    shapeLayer.lineWidth = 2.0f;
    return shapeLayer;
}

- (UIBezierPath *)bezierPathForCorners:(NSArray *)corners {
    UIBezierPath *path = [UIBezierPath bezierPath];
    for (int i = 0; i < corners.count; i++) {
        CGPoint point = [self pointForCorner:corners[i]];
        if (i == 0) {
            [path moveToPoint:point];
        } else {
            [path addLineToPoint:point];
        }
    }
    [path closePath];
    return path;
}

// Build a CGPoint from a corner dictionary
- (CGPoint)pointForCorner:(NSDictionary *)corner {
    CGPoint point;
    CGPointMakeWithDictionaryRepresentation((CFDictionaryRef)corner, &point);
    return point;
}

2.4 High Frame Rate Capture

High frame rate (FPS) capture makes motion scenes look smoother and enables high-quality slow-motion effects. Working with high frame rate video has several stages: 1) Capture: AVFoundation defaults to 30 fps, while the iPhone 6s already supports capture at 240 fps, and the framework can enable the H.264 dropped-P-frames feature so high frame rate video plays smoothly on older devices. 2) Playback: AVPlayer supports playback at many rates, and AVPlayerItem's audioTimePitchAlgorithm property tunes the audio processing. 3) Editing: covered in the final article of this series. 4) Export: AVFoundation can save high frame rate video, which can then be exported as-is or converted, for example down to the standard 30 fps.

High frame rate capture is enabled by setting AVCaptureDevice's activeFormat together with the frame durations. The device's formats property yields every format the receiver supports, as an array of AVCaptureDeviceFormat objects that define the chroma subsampling and the available frame rate ranges (videoSupportedFrameRateRanges); each AVFrameRateRange object defines the supported frame rates and frame durations.

Create THQualityOfService, a private utility class used by the AVCaptureDevice category

@interface THQualityOfService : NSObject
@property(strong, nonatomic, readonly) AVCaptureDeviceFormat *format;
@property(strong, nonatomic, readonly) AVFrameRateRange *frameRateRange;
@property(assign, nonatomic, readonly) BOOL isHighFrameRate;

+ (instancetype)qosWithFormat:(AVCaptureDeviceFormat *)format frameRateRange:(AVFrameRateRange *)frameRateRange;
@end

@implementation THQualityOfService
+ (instancetype)qosWithFormat:(AVCaptureDeviceFormat *)format frameRateRange:(AVFrameRateRange *)frameRateRange {
    return [[self alloc] initWithFormat:format frameRateRange:frameRateRange];
}

- (instancetype)initWithFormat:(AVCaptureDeviceFormat *)format frameRateRange:(AVFrameRateRange *)frameRateRange {
    if (self = [super init]) {
        _format = format;
        _frameRateRange = frameRateRange;
    }
    return self;
}

- (BOOL)isHighFrameRate {
    return self.frameRateRange.maxFrameRate > 30.0f;
}
@end

Implement the AVCaptureDevice category with the high frame rate logic

@implementation AVCaptureDevice (THAdditions)
- (BOOL)supportsHighFrameRateCapture {
    if (![self hasMediaType:AVMediaTypeVideo]) {
        return NO;
    }
    return [self findHighestQualityOfService].isHighFrameRate;
}

- (THQualityOfService *)findHighestQualityOfService {
    AVCaptureDeviceFormat *maxFormat = nil;
    AVFrameRateRange *maxFrameRateRange = nil;
    
    for (AVCaptureDeviceFormat *format in self.formats) {
        FourCharCode codecType = CMVideoFormatDescriptionGetCodecType(format.formatDescription);
        // CMFormatDescriptionRef is a Core Media type describing how the image is sampled; iPhone devices here use 4:2:0
        // chroma subsampling with video-range color values. See the header and the chroma subsampling discussion in the first article of this series
        if (codecType == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) {
            NSArray *frameRateRanges = format.videoSupportedFrameRateRanges;
            for (AVFrameRateRange *range in frameRateRanges) {
                if (range.maxFrameRate > maxFrameRateRange.maxFrameRate) {
                    maxFormat = format;
                    maxFrameRateRange = range;
                }
            }
        }
    }
    return [THQualityOfService qosWithFormat:maxFormat frameRateRange:maxFrameRateRange];
}

- (BOOL)enableMaxFrameRateCapture:(NSError **)error {
    THQualityOfService *qos = [self findHighestQualityOfService];
    if (!qos.isHighFrameRate) {
        if (error) { // guard against callers that pass NULL
            NSString *message = @"Device does not support high FPS capture";
            NSDictionary *userInfo = @{NSLocalizedDescriptionKey : message};
            NSUInteger code = THCameraErrorHighFrameRateCaptureNotSupported;
            *error = [NSError errorWithDomain:THCameraErrorDomain code:code userInfo:userInfo];
        }
        return NO;
    }
    
    if ([self lockForConfiguration:error]) {
        CMTime minFrameDuration = qos.frameRateRange.minFrameDuration;
        self.activeFormat = qos.format;
        // AVFoundation drives the capture device with per-frame durations (1/frame rate), not frame rates
        self.activeVideoMinFrameDuration = minFrameDuration;
        self.activeVideoMaxFrameDuration = minFrameDuration;
        [self unlockForConfiguration];
        return YES;
    }
    return NO;
}
@end

Using the high frame rate feature from the camera controller

@implementation THCameraController
- (BOOL)cameraSupportsHighFrameRateCapture {
    return [self.activeCamera supportsHighFrameRateCapture];
}

- (BOOL)enableHighFrameRateCapture {
    NSError *error;
    BOOL enabled = [self.activeCamera enableMaxFrameRateCapture:&error];
    if (!enabled) {
        [self.delegate deviceConfigurationFailedWithError:error];
    }
    return enabled;
}
@end

2.5 Processing Captured Video Data

AVCaptureMovieFileOutput is a simplified video capture class; when more custom processing is needed, use AVCaptureVideoDataOutput instead. Combined with the OpenGL ES and Core Animation APIs, it can composite overlays and other visuals into the final video. The audio counterpart, AVCaptureAudioDataOutput, is not covered here. Capturing data with a video data output relies on two important delegate methods:

  • captureOutput:didOutputSampleBuffer...: called every time a new video frame is captured; the layout of the data in sampleBuffer is determined by the output's videoSettings.
  • captureOutput:didDropSampleBuffer...: called when subsequent frames cannot arrive on time because the previous method spent too long processing a single frame, so per-frame work should be kept as fast as possible (see the sketch after this list).
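Whether late frames are simply dropped is controlled by the output's alwaysDiscardsLateVideoFrames property; a small sketch:

AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
// YES (the default) drops frames that arrive while the delegate is still busy, keeping
// latency low; NO queues them instead, which may be preferable when recording.
output.alwaysDiscardsLateVideoFrames = YES;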
2.5.1 A Brief Look at CMSampleBuffer

A CMSampleBuffer can contain media sample data, a format description, and metadata. In the pixel data held by a CVPixelBuffer, RGBA-style formats are usually stored contiguously, while 4:2:0-style formats usually use two planes that store the luma and the chroma components separately.
Reading the sample data output by the video data output

// For the pixel data in a CVPixelBuffer: convert an RGB image to grayscale (a plain average; the standard luma weights are not used here)
- (void)handleCVPixelBuffer {
    const int BYTES_PER_PIXEL = 4;
    CMSampleBufferRef sampleBuffer = nil; // placeholder: in practice this is the buffer delivered by the capture callback
    // pixelBuffer is an unretained reference and does not need to be released manually
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    size_t bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
    size_t bufferHeight = CVPixelBufferGetHeight(pixelBuffer);
    unsigned char *pixel = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);
    unsigned char grayPixel;
    
    for (int row = 0; row < bufferHeight; row++) {
        for (int column = 0; column < bufferWidth; column++) {
            // Each component is a 0-255 integer; the unsigned char operands are promoted to int before the addition, so the sum does not overflow
            grayPixel = (pixel[0] + pixel[1] + pixel[2])/3;
            pixel[0] = pixel[1] = pixel[2] = grayPixel;
            pixel += BYTES_PER_PIXEL;
        }
    }
    
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    // 此處可以得到經(jīng)過處理的圖片
}
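The grayscale example above assumes an interleaved BGRA buffer. For the biplanar 4:2:0 formats mentioned earlier, each plane is addressed separately; a minimal sketch (assuming pixelBuffer was obtained and locked as above):

// A sketch of reading a biplanar 4:2:0 buffer: plane 0 holds luma (Y),
// plane 1 holds interleaved chroma (CbCr) at half resolution.
if (CVPixelBufferIsPlanar(pixelBuffer)) {
    uint8_t *lumaPlane =
        (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    size_t lumaHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
    // Rows may be padded, so step by bytes-per-row rather than by width.
    size_t lumaBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    for (size_t row = 0; row < lumaHeight; row++) {
        uint8_t *rowStart = lumaPlane + row * lumaBytesPerRow;
        // process one row of luma samples starting at rowStart...
    }
    // The chroma plane (index 1) is half the luma plane's size in each dimension.
    uint8_t *chromaPlane =
        (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
}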

Reading the format description
CMFormatDescriptionRef defines properties common to all media types; depending on the media type there are video and audio variants (CMVideoFormatDescriptionRef and CMAudioFormatDescriptionRef), each defining type-specific properties.

// Extract image or audio data from a CMSampleBuffer according to its format's media type
- (void)getMediaData {
    CMSampleBufferRef sampleBuffer = nil; // placeholder: in practice this is the buffer delivered by the capture callback
    CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
    CMMediaType mediaType = CMFormatDescriptionGetMediaType(formatDescription);
    if (mediaType == kCMMediaType_Video) {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // process the image data
    } else if (mediaType == kCMMediaType_Audio) {
        CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
        // process the audio data
    }
}

Timing information
CMSampleBuffer also records CMTime values for when the frame was captured.
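These values are read with Core Media functions; a small sketch, assuming sampleBuffer is the buffer delivered by the capture callback:

CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
CMTime duration = CMSampleBufferGetDuration(sampleBuffer);
NSLog(@"frame at %.3fs, duration %.3fs",
      CMTimeGetSeconds(pts), CMTimeGetSeconds(duration));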

Reading metadata
CMSampleBuffer can also carry metadata such as exchangeable image file format (Exif) attachments.

// Read the exchangeable image file format (Exif) attachment, which records a wealth of metadata about the frame: exposure mode, dimensions, white balance, and more
- (void)getExifAttachment {
    CMSampleBufferRef sampleBuffer = nil; // placeholder: in practice this is the buffer delivered by the capture callback
    CFDictionaryRef exifAttachments = (CFDictionaryRef)CMGetAttachment(sampleBuffer, kCGImagePropertyExifDictionary, NULL);
}
2.5.2 Using AVCaptureVideoDataOutput

Defining the camera management class

@protocol THTextureDelegate <NSObject>
// Called when a new texture is created; target is the texture type, and OpenGL ES can fetch the corresponding texture by name
- (void)textureCreatedWithTarget:(GLenum)target name:(GLuint)name;
@end

@interface THCameraController : THBaseCameraController
// context manages OpenGL ES state, along with the resources needed for drawing
- (instancetype)initWithContext:(EAGLContext *)context;
@property (weak, nonatomic) id <THTextureDelegate> textureDelegate;
@end

Capturing the image data and converting it into textures OpenGL ES can use
OpenGL ES processes images on the GPU and is the only real solution for high-performance video processing; its use deserves an article of its own, so here we only convert the image data captured by AVFoundation into OpenGL ES textures, using a GLKViewController as the root view controller. The book Learning OpenGL ES for iOS is also worth reading.

@interface THCameraController () <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (weak, nonatomic) EAGLContext *context;
@property (strong, nonatomic) AVCaptureVideoDataOutput *videoDataOutput;

// Bridges Core Video pixel buffers and OpenGL ES textures, reducing the overhead of moving data between the CPU and GPU
@property (nonatomic) CVOpenGLESTextureCacheRef textureCache;
@property (nonatomic) CVOpenGLESTextureRef cameraTexture;
@end

@implementation THCameraController
- (instancetype)initWithContext:(EAGLContext *)context {
    if (self = [super init]) {
        _context = context;
        CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, _context, NULL, &_textureCache);
        if (err != kCVReturnSuccess) {
            NSLog(@"Error creating texture cache. %d", err);
        }
    }
    return self;
}

- (NSString *)sessionPreset {
    return AVCaptureSessionPreset640x480;
}

- (BOOL)setupSessionOutputs:(NSError **)error {
    self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    // The camera's default is a 4:2:0 Y'CbCr pixel format, but OpenGL commonly works with BGRA. With the default
    // setting the samples in the CVPixelBuffer are Y'CbCr, whereas requesting 32BGRA yields BGRA-ordered data,
    // at the cost of some camera performance and larger sample sizes
    self.videoDataOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
    // The delegate must be called back on a serial queue; the main queue is used here, but a custom queue also works
    [self.videoDataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    if ([self.captureSession canAddOutput:self.videoDataOutput]) {
        [self.captureSession addOutput:self.videoDataOutput];
        return YES;
    }
    return NO;
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVReturn err;
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
    CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(formatDescription);
    
    // Passing height for both dimensions crops the video horizontally to a square texture
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _textureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGBA, dimensions.height, dimensions.height, GL_BGRA, GL_UNSIGNED_BYTE, 0, &_cameraTexture);
    if (!err) {
        GLenum target = CVOpenGLESTextureGetTarget(_cameraTexture);
        GLuint name = CVOpenGLESTextureGetName(_cameraTexture);
        [self.textureDelegate textureCreatedWithTarget:target name:name];
    } else {
        NSLog(@"Error at CVOpenGLESTextureCacheCreatTextureFromImage %d",err);
    }
    [self cleanupTextures];
}

- (void)cleanupTextures {
    if (_cameraTexture) {
        CFRelease(_cameraTexture);
        _cameraTexture = NULL;
    }
    CVOpenGLESTextureCacheFlush(_textureCache, 0);
}
@end