Short Video Recording and Playback

A while back, a project I was working on needed a WeChat-style short video feature: record a clip, then play it back on a custom layer over the recording view. I dug into AVFoundation to build it, hit a few problems along the way, and am writing them down here. The SBVideoCaptureDemo source code was a useful reference.

Recording uses AVCaptureSession, AVCaptureMovieFileOutput, AVCaptureDeviceInput, and AVCaptureVideoPreviewLayer; the recorded clips are then compressed and converted to MP4 with AVAssetExportSession.

Playback on a custom layer uses AVPlayerLayer, AVPlayer, AVPlayerItem, and NSURL.

1. Video Recording


Checking whether the device supports video recording

1. Before recording, check that a camera is actually available on the device.

2. Check that the app has been authorized to use the camera (see the sketch below).
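Both checks are thin wrappers over system APIs. A minimal sketch, not from the original source (the method name checkCameraSupport: is made up here):

- (void)checkCameraSupport:(void (^)(BOOL granted))completion
{
    // 1. Is a camera physically available?
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        completion(NO);
        return;
    }
    // 2. Is the app authorized to capture video?
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    if (AVAuthorizationStatusNotDetermined == status) {
        // first launch: ask the user, then report back on the main queue
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            dispatch_async(dispatch_get_main_queue(), ^{
                completion(granted);
            });
        }];
    }
    else {
        completion(AVAuthorizationStatusAuthorized == status);
    }
}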

Custom video recording

A quick rundown of the classes involved:

AVCaptureSession: the media (audio/video) capture session; it routes captured audio and video data to the output objects. One AVCaptureSession can have multiple inputs and outputs.

AVCaptureDevice: an input device such as the microphone or a camera; it also exposes physical device settings (focus, exposure, and so on).

AVCaptureDeviceInput: manages the data coming from one device; you create one from an AVCaptureDevice and hand it to the AVCaptureSession to manage.

AVCaptureVideoPreviewLayer: a CALayer subclass that shows the live camera preview; it is created from the corresponding AVCaptureSession.

AVCaptureMovieFileOutput: the movie file output. Once inputs and outputs are added to an AVCaptureSession, the session establishes connections (AVCaptureConnection) between all compatible inputs and outputs.

// states of the video-production pipeline
typedef NS_ENUM(NSInteger, VideoState)
{
    VideoStateFree = 0,
    VideoStateWillStartRecord,
    VideoStateDidStartRecord,
    VideoStateWillEndRecord,
    VideoStateDidEndRecord,
    VideoStateWillStartMerge,
    VideoStateDidStartMerge,
};

// unlike VideoState, this tracks the user's action:
// e.g. recording has begun, recording has stopped
typedef NS_ENUM(NSInteger, RecordOptState)
{
    RecordOptStateFree = 0,
    RecordOptStateBegin,
    RecordOptStateEnd,
};

// the region the user's finger is in while recording, used to tell
// the record region from the cancel-recording region
typedef NS_ENUM(NSInteger, CurrentRecordRegion)
{
    CurrentRecordRegionFree = 0,
    CurrentRecordRegionRecord,
    CurrentRecordRegionCancelRecord,
};

Initialization and setup

self.captureSession = [[AVCaptureSession alloc] init];
AVCaptureDevice *frontCamera = nil;
AVCaptureDevice *backCamera = nil;
NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *camera in cameras) {
    if (AVCaptureDevicePositionFront == camera.position) {// front camera
        frontCamera = camera;
    }
    else if (AVCaptureDevicePositionBack == camera.position) {
        backCamera = camera;
    }
}

// use the back camera by default
[backCamera lockForConfiguration:nil];// lock the device first
if ([backCamera isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
    [backCamera setExposureMode:AVCaptureExposureModeContinuousAutoExposure];// exposure adjustment
}
if ([backCamera isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {// focus (CGPoint)
    [backCamera setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
}
[backCamera unlockForConfiguration];

[self.captureSession beginConfiguration];

// input devices
self.videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:nil];
AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio] error:nil];
if ([self.captureSession canAddInput:self.videoDeviceInput]) {
    [self.captureSession addInput:self.videoDeviceInput];
}
if ([self.captureSession canAddInput:audioDeviceInput]) {
    [self.captureSession addInput:audioDeviceInput];
}

// output device
self.movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([self.captureSession canAddOutput:self.movieFileOutput]) {
    [self.captureSession addOutput:self.movieFileOutput];
}

// preset
if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    self.captureSession.sessionPreset = AVCaptureSessionPreset640x480;//AVCaptureSessionPresetLow
}

// preview layer
self.preViewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
self.preViewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

[self.captureSession commitConfiguration];
[self.captureSession startRunning];// the session starts running

Note: always call lockForConfiguration before changing a device's properties, and call unlockForConfiguration when you are done. When configuring the camera, also check that the device actually supports the setting in question, e.g. with isExposureModeSupported:, isFocusModeSupported:, and so on.
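As another example of the same pattern, a tap-to-focus helper might look like the following sketch (focusAtPoint: does not appear in the original source):

- (void)focusAtPoint:(CGPoint)point
{
    AVCaptureDevice *device = self.videoDeviceInput.device;
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {// lock first
        if ([device isFocusPointOfInterestSupported]) {
            device.focusPointOfInterest = point;// normalized (0,0)-(1,1) coordinates
        }
        if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
            device.focusMode = AVCaptureFocusModeAutoFocus;
        }
        [device unlockForConfiguration];// then unlock
    }
}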

// start recording
- (void)startRecordingToOutputFileURL
{
    _videoState = VideoStateWillStartRecord;
    _recordOptState = RecordOptStateBegin;
    // get the capture connection from the movie file output;
    // the connection carries the device output's data
    AVCaptureConnection *captureConnection = [self.movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if (![self.movieFileOutput isRecording]) {
        // keep the video orientation in sync with the preview layer
        captureConnection.videoOrientation = [self.preViewLayer connection].videoOrientation;
        [self.movieFileOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:[self getVideoSaveFilePathString]] recordingDelegate:self];// start recording
    }
    else {
        [self stopCurrentVideoRecording];
    }
}
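getVideoSaveFilePathString (and getVideoMergeFilePathString, used later during export) are referenced but never shown in the post. A minimal sketch, assuming clips are written to the temporary directory under timestamp names:

- (NSString *)getVideoSaveFilePathString
{
    // hypothetical helper: tmp/<timestamp>.mov for the raw recording
    NSString *name = [NSString stringWithFormat:@"%.0f.mov", [[NSDate date] timeIntervalSince1970] * 1000];
    return [NSTemporaryDirectory() stringByAppendingPathComponent:name];
}

- (NSString *)getVideoMergeFilePathString
{
    // hypothetical helper: tmp/<timestamp>.mp4 for the merged export
    NSString *name = [NSString stringWithFormat:@"%.0f.mp4", [[NSDate date] timeIntervalSince1970] * 1000];
    return [NSTemporaryDirectory() stringByAppendingPathComponent:name];
}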

// stop recording
- (void)stopCurrentVideoRecording
{
    [self stopCountDurTimer];// stop the duration timer
    _videoState = VideoStateWillEndRecord;
    [self.movieFileOutput stopRecording];// stop recording
}
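The duration timer helpers (startCountDurTimer, stopCountDurTimer) and the currentVideoDur bookkeeping are referenced but not shown either. A minimal sketch, assuming an NSTimer property named countDurTimer and a fixed tick interval:

#define COUNT_DUR_TIMER_INTERVAL 0.05f// hypothetical tick interval

- (void)startCountDurTimer
{
    self.countDurTimer = [NSTimer scheduledTimerWithTimeInterval:COUNT_DUR_TIMER_INTERVAL target:self selector:@selector(onCountDurTimer:) userInfo:nil repeats:YES];
}

- (void)onCountDurTimer:(NSTimer *)timer
{
    self.currentVideoDur += COUNT_DUR_TIMER_INTERVAL;// accumulate the current clip's duration
    // a real implementation would also stop recording here once a max duration is hit
}

- (void)stopCountDurTimer
{
    [self.countDurTimer invalidate];
    self.countDurTimer = nil;
}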

#pragma mark - AVCaptureFileOutputRecordingDelegate

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{
    _videoState = VideoStateDidStartRecord;
    self.videoSaveFilePath = [fileURL absoluteString];
    self.currentFileURL = fileURL;
    self.currentVideoDur = 0.0f;
    self.totalVideoDur = 0.0f;
    [self startCountDurTimer];// start the duration timer
    // fire the did-start-recording delegate callback here
    if (RecordOptStateEnd == _recordOptState) {
        // the press was so short that the user released the record button
        // before recording actually began — stop the in-flight recording
        [self stopCurrentVideoRecording];
    }
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    _videoState = VideoStateDidEndRecord;
    self.totalVideoDur += _currentVideoDur;
    // fire the did-finish-recording delegate callback here
    if (CurrentRecordRegionRecord == [self getCurrentRecordRegion]) {
        if (self.totalVideoDur < MIN_VIDEO_DUR) {// recording too short
            [self removeMovFile];// delete the .mov file
            _videoState = VideoStateFree;
        }
    }
    else {
        // the finger ended in the cancel region — discard the recording
        [self removeMovFile];// delete the .mov file
        _videoState = VideoStateFree;
    }
}
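removeMovFile is referenced throughout but also not shown; a minimal sketch, assuming currentFileURL points at the intermediate .mov recording. (getCurrentRecordRegion, which maps the finger position to a CurrentRecordRegion value, is likewise omitted from the post.)

- (void)removeMovFile
{
    // hypothetical cleanup helper: delete the intermediate .mov recording
    if (!self.currentFileURL) {
        return;
    }
    NSError *removeError = nil;
    if (![[NSFileManager defaultManager] removeItemAtURL:self.currentFileURL error:&removeError]) {
        NSLog(@"failed to remove mov file: %@", removeError.localizedDescription);
    }
    self.currentFileURL = nil;
}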

// merge the .mov clips and export as MP4
- (void)mergeAndExportVideosAtFileURLs:(NSArray *)fileURLArray
{
    _videoState = VideoStateWillStartMerge;
    NSError *error = nil;
    // render size
    CGSize renderSize = CGSizeMake(0, 0);
    NSMutableArray *layerInstructionArray = [NSMutableArray array];
    // the composition the clips are merged into
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    CMTime totalDuration = kCMTimeZero;
    // collect the video tracks first, also to determine renderSize
    NSMutableArray *assetTrackArray = [NSMutableArray array];
    NSMutableArray *assetArray = [NSMutableArray array];
    for (NSURL *fileURL in fileURLArray) {
        // AVAsset: a media asset
        AVAsset *asset = [AVAsset assetWithURL:fileURL];
        if (!asset) {
            continue;
        }
        [assetArray addObject:asset];
        // the asset's video track
        AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        [assetTrackArray addObject:assetTrack];
        // note: naturalSize is pre-rotation, so width and height are swapped here
        renderSize.width = MAX(renderSize.width, assetTrack.naturalSize.height);
        renderSize.height = MAX(renderSize.height, assetTrack.naturalSize.width);
    }

    CGFloat renderW = 320;//MIN(renderSize.width, renderSize.height);

    for (NSInteger i = 0; i < [assetArray count] && i < assetTrackArray.count; i++) {
        AVAsset *asset = [assetArray objectAtIndex:i];
        AVAssetTrack *assetTrack = [assetTrackArray objectAtIndex:i];
        // the composition's audio track; source audio (the microphone track) is inserted into it
        AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
        if (audioTracks.count > 0) {// guard: some clips may have no audio
            [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[audioTracks objectAtIndex:0] atTime:totalDuration error:nil];
        }
        // the composition's video track; source video is inserted into it
        AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetTrack atTime:totalDuration error:&error];
        // a layer instruction lets one clip in the video track be scaled, rotated, etc.
        AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
        totalDuration = CMTimeAdd(totalDuration, asset.duration);
        CGFloat rate = renderW / MIN(assetTrack.naturalSize.width, assetTrack.naturalSize.height);
        CGAffineTransform layerTransform = CGAffineTransformMake(assetTrack.preferredTransform.a, assetTrack.preferredTransform.b, assetTrack.preferredTransform.c, assetTrack.preferredTransform.d, assetTrack.preferredTransform.tx * rate, assetTrack.preferredTransform.ty * rate);
        layerTransform = CGAffineTransformConcat(layerTransform, CGAffineTransformMake(1, 0, 0, 1, 0, -(assetTrack.naturalSize.width - assetTrack.naturalSize.height) / 2.0));// shift up so the middle of the frame is kept
        layerTransform = CGAffineTransformScale(layerTransform, rate, rate);// scale, so front- and back-camera clips come out the same size
        [layerInstruction setTransform:layerTransform atTime:kCMTimeZero];
        [layerInstruction setOpacity:0.0 atTime:totalDuration];// hide this clip once it ends
        [layerInstructionArray addObject:layerInstruction];
    }

    // the save path
    NSURL *mergeFileURL = [NSURL fileURLWithPath:[self getVideoMergeFilePathString]];

    // export
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, totalDuration);
    mainInstruction.layerInstructions = layerInstructionArray;
    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
    mainCompositionInst.instructions = @[mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 100);
    //mainCompositionInst.renderSize = CGSizeMake(renderW, renderW * (sH/sW));
    mainCompositionInst.renderSize = CGSizeMake(renderW, renderW * 0.75);// 4:3 aspect ratio
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    exporter.videoComposition = mainCompositionInst;
    exporter.outputURL = mergeFileURL;
    exporter.outputFileType = AVFileTypeMPEG4;// MP4 output
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            if (AVAssetExportSessionStatusCompleted == exporter.status) {// check the result before cleaning up
                _videoState = VideoStateDidStartMerge;
                // fire the conversion-succeeded delegate callback here
                [self removeMovFile];// delete the .mov files
            }
            else {
                NSLog(@"export failed: %@", exporter.error.localizedDescription);
            }
        });
    }];
}
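For a single WeChat-style clip, the array passed in holds just the one recorded file. A hypothetical call site, after a recording of sufficient length finishes:

// e.g. after a successful recording:
[self mergeAndExportVideosAtFileURLs:@[self.currentFileURL]];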

// compute the video file's size in KB
- (NSInteger)getFileSize:(NSString *)path
{
    path = [path stringByReplacingOccurrencesOfString:@"file://" withString:@""];
    NSFileManager *filemanager = [NSFileManager defaultManager];
    if ([filemanager fileExistsAtPath:path]) {
        NSDictionary *attributes = [filemanager attributesOfItemAtPath:path error:nil];
        NSNumber *theFileSize;
        if ((theFileSize = [attributes objectForKey:NSFileSize])) {
            return [theFileSize intValue] / 1024;// bytes -> KB
        }
        else {
            return -1;
        }
    }
    else {
        return -1;
    }
}

// zoom in / zoom out
- (void)changeDeviceVideoZoomFactor
{
    AVCaptureDevice *backCamera = [self getCameraDevice:NO];
    CGFloat current = 1.0;
    if (1.0 == backCamera.videoZoomFactor) {
        current = 2.0f;
        if (current > backCamera.activeFormat.videoMaxZoomFactor) {
            current = backCamera.activeFormat.videoMaxZoomFactor;
        }
    }
    NSError *error = nil;
    if ([backCamera lockForConfiguration:&error]) {
        [backCamera rampToVideoZoomFactor:current withRate:10];
        [backCamera unlockForConfiguration];
    }
    else {
        NSLog(@"failed to lock the device for configuration: %@", error.localizedDescription);
    }
}

- (AVCaptureDevice *)getCameraDevice:(BOOL)isFront
{
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *frontCamera;
    AVCaptureDevice *backCamera;
    for (AVCaptureDevice *camera in cameras) {
        if (AVCaptureDevicePositionFront == camera.position) {
            frontCamera = camera;
        }
        else if (AVCaptureDevicePositionBack == camera.position) {
            backCamera = camera;
        }
    }
    if (isFront) {
        return frontCamera;
    }
    return backCamera;
}


2. Custom Video Playback

// Note: the playback URL must be built with fileURLWithPath; it takes the form "file://var
- (instancetype)initVideoFileURL:(NSURL *)videoFileURL withFrame:(CGRect)frame withView:(UIView *)view
{
    self = [super init];
    if (self) {
        self.videoFileURL = videoFileURL;
        [self registerNotificationMessage];
        [self initPlayLayer:frame withView:view];
    }
    return self;
}

- (void)initPlayLayer:(CGRect)rect withView:(UIView *)view
{
    if (!_videoFileURL) {
        return;
    }
    AVAsset *asset = [AVURLAsset URLAssetWithURL:_videoFileURL options:nil];
    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
    //self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
    self.player = [[AVPlayer alloc] init];
    self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    [self.player setVolume:0.0f];// muted
    [self.player seekToTime:kCMTimeZero];
    [self.player setActionAtItemEnd:AVPlayerActionAtItemEndNone];
    [self.player replaceCurrentItemWithPlayerItem:self.playerItem];
    self.playerLayer.frame = rect;
    self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [view.layer addSublayer:self.playerLayer];
}

- (void)playSight
{
    [self.playerItem seekToTime:kCMTimeZero];
    [self.player play];
}

- (void)pauseSight
{
    [self.playerItem seekToTime:kCMTimeZero];
    [self.player pause];
}

- (void)releaseVideoPlayer
{
    [self removeNotificationMessage];
    if (self.player) {
        [self.player pause];
        [self.player replaceCurrentItemWithPlayerItem:nil];
    }
    if (self.playerLayer) {
        [self.playerLayer removeFromSuperlayer];
    }
    self.player = nil;
    self.playerLayer = nil;
    self.playerItem = nil;
    self.videoFileURL = nil;
}

#pragma mark - notification message

- (void)registerNotificationMessage
{
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(avPlayerItemDidPlayToEnd:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
}

- (void)removeNotificationMessage
{
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
}

- (void)avPlayerItemDidPlayToEnd:(NSNotification *)notification
{
    // loop playback: when our item finishes, seek back to the start and play again
    if (notification.object != self.playerItem) {
        return;
    }
    [self.playerItem seekToTime:kCMTimeZero];
    [self.player play];
}
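Assuming this wrapper is a class named SightVideoPlayer (the class name never appears in the post), a view controller might drive it like this (sightPlayer and previewView are made-up names):

// hypothetical call site
NSURL *fileURL = [NSURL fileURLWithPath:mp4Path];// mp4Path: path of the merged MP4
self.sightPlayer = [[SightVideoPlayer alloc] initVideoFileURL:fileURL withFrame:self.previewView.bounds withView:self.previewView];
[self.sightPlayer playSight];
// ...and when the view goes away:
[self.sightPlayer releaseVideoPlayer];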

There is much more to AVFoundation than this; I'll dig further when future requirements call for it.

Source code: https://github.com/zone1026/SightRecorder
