Video Recording, Compression & Export, and Frame Extraction

Video recording

First, we present the system's video-recording interface, which is implemented with a UIImagePickerController. Before that, we must check the user's authorization: only with camera permission can we proceed.

We also need to check whether UIImagePickerControllerSourceTypeCamera is available — the simulator, for example, does not support it. Whether any real devices lack support is unclear, but checking first is the safer way to write it. You can set the quality (i.e. resolution) of the recorded video through the videoQuality property, and limit the maximum recording length through the videoMaximumDuration property — here we cap it at 5 minutes.

```objectivec
// Requires iOS 7.0+
AVAuthorizationStatus authStatus = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
if (authStatus == AVAuthorizationStatusRestricted || authStatus == AVAuthorizationStatusDenied) {
  NSLog(@"Camera access is disabled. You can enable it in the Settings app.");
  return;
}

if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
  UIImagePickerController *picker = [[UIImagePickerController alloc] init];
  picker.delegate = self;
  picker.allowsEditing = YES;
  picker.sourceType = UIImagePickerControllerSourceTypeCamera;
  picker.videoQuality = UIImagePickerControllerQualityType640x480; // recording quality
  picker.videoMaximumDuration = 5 * 60.0f; // limit recording to at most 5 minutes
  picker.mediaTypes = @[(NSString *)kUTTypeMovie];
  [self presentViewController:picker animated:YES completion:NULL];
  self.shouldAsync = YES;
} else {
  NSLog(@"This device does not support camera capture");
}
```

Then implement the delegate to receive the recorded video.
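A minimal sketch of those delegate callbacks might look like the following; the `handleVideoAtURL:` helper is a hypothetical placeholder for your own processing code, not part of the original article:

```objectivec
#pragma mark - UIImagePickerControllerDelegate

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
  // The recorded (or selected) movie is delivered as a temporary file URL.
  NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
  [picker dismissViewControllerAnimated:YES completion:NULL];
  // Hypothetical helper: hand the URL to your own save/upload logic.
  [self handleVideoAtURL:videoURL];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
  [picker dismissViewControllerAnimated:YES completion:NULL];
}
```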

Picking a video from the photo album

Picking a video from the album is almost identical to presenting the recorder; only the sourceType differs. As before, we check the authorization status first — if the user has denied access, there is nothing more we can do.

Setting sourceType to UIImagePickerControllerSourceTypeSavedPhotosAlbum fetches the media saved in the photo album. We also need to specify mediaTypes; setting it to kUTTypeMovie is enough.

```objectivec
AVAuthorizationStatus authStatus = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
if (authStatus == AVAuthorizationStatusRestricted || authStatus == AVAuthorizationStatusDenied) {
  NSLog(@"Camera access is disabled. You can enable it in the Settings app.");
  return;
}

if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeSavedPhotosAlbum]) {
  UIImagePickerController *picker = [[UIImagePickerController alloc] init];
  picker.delegate = self;
  picker.allowsEditing = YES;
  picker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
  picker.mediaTypes = @[(NSString *)kUTTypeMovie];
  [self presentViewController:picker animated:YES completion:NULL];
  self.shouldAsync = NO;
} else {
  NSLog(@"The saved photos album is not available on this device");
}
```

Likewise, implement the delegate methods to receive the selected video.

Saving a video to the photo album

Writing to the album can be done with the ALAssetsLibrary class, which provides an asynchronous API for writing to the album; on completion, return to the main thread to update the UI:

```objectivec
NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
dispatch_async(dispatch_get_global_queue(0, 0), ^{
  // Only save the video if the album is compatible with it.
  if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:videoURL]) {
    [library writeVideoAtPathToSavedPhotosAlbum:videoURL
                                completionBlock:^(NSURL *assetURL, NSError *error) {
      dispatch_async(dispatch_get_main_queue(), ^{
        if (error == nil) {
          NSLog(@"Saved to the photo album");
        } else {
          NSLog(@"Failed to save to the photo album");
        }
      });
    }];
  }
});
```

Getting a frame image from a video

Getting a frame synchronously

To get the middle frame synchronously, we specify the time of the frame we want. The returned image object is CFRetained, so the caller must release it with CGImageRelease to free the memory. We access the video resource through AVAsset, then generate the frame image with an AVAssetImageGenerator:

```objectivec
// Get the video's center frame as the video poster image
- (UIImage *)frameImageFromVideoURL:(NSURL *)videoURL {
  // result
  UIImage *image = nil;

  // AVAssetImageGenerator
  AVAsset *asset = [AVAsset assetWithURL:videoURL];
  AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
  imageGenerator.appliesPreferredTrackTransform = YES;

  // calculate the midpoint time of the video
  Float64 duration = CMTimeGetSeconds([asset duration]);
  // The first argument is the time in seconds; the second is the timescale (units per second).
  // 600 is a commonly used timescale. As Apple notes:
  // 24 frames per second (fps) for film, 30 fps for NTSC (used for TV in North America and
  // Japan), and 25 fps for PAL (used for TV in Europe).
  // Using a timescale of 600, you can exactly represent any number of frames in these systems.
  CMTime midpoint = CMTimeMakeWithSeconds(duration / 2.0, 600);

  // get the image
  NSError *error = nil;
  CMTime actualTime;
  // Returns a CFRetained CGImageRef for an asset at or near the specified time,
  // so we must release it manually.
  CGImageRef centerFrameImage = [imageGenerator copyCGImageAtTime:midpoint
                                                       actualTime:&actualTime
                                                            error:&error];
  if (centerFrameImage != NULL) {
    image = [[UIImage alloc] initWithCGImage:centerFrameImage];
    // Release the CFRetained image
    CGImageRelease(centerFrameImage);
  }
  return image;
}
```

Getting a frame asynchronously

Getting a frame asynchronously differs from the synchronous version only in the API called: you can pass multiple time points, and the generator reports the actual time used and returns the image — which, this time, we do not need to release manually. Generation can fail, so check that the result is AVAssetImageGeneratorSucceeded before converting the image:

```objectivec
// Asynchronously get frame images; multiple frames can be requested at once
- (void)centerFrameImageWithVideoURL:(NSURL *)videoURL
                          completion:(void (^)(UIImage *image))completion {
  // AVAssetImageGenerator
  AVAsset *asset = [AVAsset assetWithURL:videoURL];
  AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
  imageGenerator.appliesPreferredTrackTransform = YES;

  // calculate the midpoint time of the video
  Float64 duration = CMTimeGetSeconds([asset duration]);
  // The first argument is the time in seconds; the second is the timescale (units per second).
  // 600 is a commonly used timescale. As Apple notes:
  // 24 frames per second (fps) for film, 30 fps for NTSC (used for TV in North America and
  // Japan), and 25 fps for PAL (used for TV in Europe).
  // Using a timescale of 600, you can exactly represent any number of frames in these systems.
  CMTime midpoint = CMTimeMakeWithSeconds(duration / 2.0, 600);

  // Asynchronously generate the frame image(s)
  NSValue *midTime = [NSValue valueWithCMTime:midpoint];
  [imageGenerator generateCGImagesAsynchronouslyForTimes:@[midTime]
                                       completionHandler:^(CMTime requestedTime,
                                                           CGImageRef _Nullable image,
                                                           CMTime actualTime,
                                                           AVAssetImageGeneratorResult result,
                                                           NSError *_Nullable error) {
    if (result == AVAssetImageGeneratorSucceeded && image != NULL) {
      UIImage *centerFrameImage = [[UIImage alloc] initWithCGImage:image];
      dispatch_async(dispatch_get_main_queue(), ^{
        if (completion) {
          completion(centerFrameImage);
        }
      });
    } else {
      dispatch_async(dispatch_get_main_queue(), ^{
        if (completion) {
          completion(nil);
        }
      });
    }
  }];
}
```

Compressing and exporting a video

壓縮視頻是因?yàn)橐曨l分辨率過高所生成的視頻的大小太大了,對于移動設(shè)備來說,內(nèi)存是不能太大的,如果不支持分片上傳到服務(wù)器,或者不支持流上傳、文件上傳,而只能支持表單上傳,那么必須要限制大小,壓縮視頻。

For example, when we integrated a certain platform's video upload, it still supported only form uploads — no streaming and no file uploads — so any slightly larger video caused a crash. A streaming upload would succeed on our side, but their backend did not recognize it; that platform really burned us this time. Uploading directly as a file also went through, with the progress reaching 100%, yet they still treated it as a failure. Frustrating!

Back to the topic: compressing and exporting a video is done with AVAssetExportSession. We need to specify a preset and verify that it is supported — only then can we use it.

Here we set the preset to AVAssetExportPreset640x480, which is fairly aggressive compression; choose it according to what the server's video upload can accept. Then we compress and export the video asynchronously:

```objectivec
- (void)compressVideoWithVideoURL:(NSURL *)videoURL
                        savedName:(NSString *)savedName
                       completion:(void (^)(NSString *savedPath))completion {
  // Access the video by URL
  AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];

  // Find the presets compatible with this video asset.
  NSArray *presets = [AVAssetExportSession exportPresetsCompatibleWithAsset:videoAsset];

  // Begin to compress the video.
  // Here we just compress to a low resolution if it is supported.
  // If you need to upload to a server that doesn't support uploading by streaming,
  // you can compress the resolution even lower; otherwise you can keep a higher one.
  if ([presets containsObject:AVAssetExportPreset640x480]) {
    AVAssetExportSession *session =
        [[AVAssetExportSession alloc] initWithAsset:videoAsset
                                         presetName:AVAssetExportPreset640x480];

    NSString *doc = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    NSString *folder = [doc stringByAppendingPathComponent:@"HYBVideos"];
    BOOL isDir = NO;
    BOOL isExist = [[NSFileManager defaultManager] fileExistsAtPath:folder isDirectory:&isDir];
    if (!isExist || (isExist && !isDir)) {
      NSError *error = nil;
      [[NSFileManager defaultManager] createDirectoryAtPath:folder
                                withIntermediateDirectories:YES
                                                 attributes:nil
                                                      error:&error];
      if (error == nil) {
        NSLog(@"Directory created");
      } else {
        NSLog(@"Failed to create directory");
      }
    }

    NSString *outPutPath = [folder stringByAppendingPathComponent:savedName];
    session.outputURL = [NSURL fileURLWithPath:outPutPath];

    // Optimize for network use.
    session.shouldOptimizeForNetworkUse = YES;

    NSArray *supportedTypeArray = session.supportedFileTypes;
    if ([supportedTypeArray containsObject:AVFileTypeMPEG4]) {
      session.outputFileType = AVFileTypeMPEG4;
    } else if (supportedTypeArray.count == 0) {
      NSLog(@"No supported file types");
      return;
    } else {
      session.outputFileType = [supportedTypeArray objectAtIndex:0];
    }

    // Export the video to the output path asynchronously.
    [session exportAsynchronouslyWithCompletionHandler:^{
      if ([session status] == AVAssetExportSessionStatusCompleted) {
        dispatch_async(dispatch_get_main_queue(), ^{
          if (completion) {
            completion([session.outputURL path]);
          }
        });
      } else {
        dispatch_async(dispatch_get_main_queue(), ^{
          if (completion) {
            completion(nil);
          }
        });
      }
    }];
  }
}
```

Fixing the view-offset bug caused by recording video on iOS 8

iOS 8 has this bug: after presenting the video-recording page and coming back, the whole view has shifted downward. There are probably many fixes online; the following is just one of them:

```objectivec
[picker dismissViewControllerAnimated:YES completion:^{
  // for fixing the iOS 8.0 problem that the frame changed after opening the camera to record video
  self.tabBarController.view.frame = [[UIScreen mainScreen] bounds];
  [self.tabBarController.view layoutIfNeeded];
}];
```

Tip: remember to call this in both the selection and the cancellation delegate callbacks!

By 標(biāo)哥的技術(shù)博客 (Jianshu author)

Original post: http://www.lxweimin.com/p/6f23f608048e

Copyright belongs to the author. To reprint, please contact the author for authorization and credit the "Jianshu author".
