iOS Audio & Video — Capturing, Recording, and Playing Back with AVFoundation (AVCapture + AVAssetWriter + AVPlayer)

First, a quick look at AVFoundation.

AVFoundation is a key framework on both OS X and iOS for working with time-based audiovisual media, and its design relies heavily on multithreading. Essentially all of the software and hardware control related to audio and video lives in this framework.
AVFoundation is one of several frameworks you can use to play and create time-based audiovisual media, and it exposes a detailed Objective-C interface to that data. You can use it to inspect, create, edit, and re-encode media files, to obtain input streams from devices, and to manipulate video during real-time capture and playback. It takes full advantage of multi-core hardware, making heavy use of blocks and Grand Central Dispatch (GCD) to push expensive work onto background threads, and it automatically provides hardware-accelerated operations so applications perform well on most devices.

AVAssetWriter and AVPlayer, used below, are also part of AVFoundation.
GitHub repository

1. Capturing Audio and Video

/**  Session that coordinates data flow between the inputs and outputs  */
@property (strong, nonatomic) AVCaptureSession *captureSession;
/**  Video input  */
@property (nonatomic, strong) AVCaptureDeviceInput *videoInput;
/**  Video data output  */
@property (nonatomic, strong) AVCaptureVideoDataOutput *videoOutput;
/**  Audio data output  */
@property (nonatomic, strong) AVCaptureAudioDataOutput *audioOutput;
/**  Preview layer  */
@property (strong, nonatomic) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;

These classes handle the capture side.

First, lazily initialize the AVCaptureSession:

- (AVCaptureSession *)captureSession
{
    if (_captureSession == nil)
    {
        _captureSession = [[AVCaptureSession alloc] init];
        
        if ([_captureSession canSetSessionPreset:AVCaptureSessionPresetHigh])
        {
            _captureSession.sessionPreset = AVCaptureSessionPresetHigh;
        }
    }
    
    return _captureSession;
}
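
Before setting up any inputs, the app also needs camera and microphone permission (plus the NSCameraUsageDescription and NSMicrophoneUsageDescription keys in Info.plist). The article does not show this step; a minimal sketch might look like this:

// Sketch only (not in the original post): request camera and microphone access
// before building the capture session.
- (void)requestCaptureAuthorizationWithCompletion:(void (^)(BOOL granted))completion
{
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL videoGranted) {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL audioGranted) {
            dispatch_async(dispatch_get_main_queue(), ^{
                completion(videoGranted && audioGranted);
            });
        }];
    }];
}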

Next, set up the video and audio inputs and outputs.

/**
 *  Configure the video input and output
 */
- (void)setupVideo
{
    AVCaptureDevice *captureDevice = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];
    
    if (!captureDevice)
    {
        NSLog(@"Failed to get the back camera.");
        
        return;
    }
    
    NSError *error = nil;
    
    AVCaptureDeviceInput *videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:captureDevice error:&error];
    if (error)
    {
        NSLog(@"Failed to create the videoInput device input: %@", error);
        
        return;
    }
    
    // Add the device input to the session
    if ([self.captureSession canAddInput:videoInput])
    {
        [self.captureSession addInput:videoInput];
    }
    
    self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    
    self.videoOutput.alwaysDiscardsLateVideoFrames = NO; // NO keeps late frames instead of dropping them; the default (YES) discards late frames to save memory
    
    [self.videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
    
    [self.videoOutput setSampleBufferDelegate:self queue:self.videoQueue];
    
    if ([self.captureSession canAddOutput:self.videoOutput])
    {
        [self.captureSession addOutput:self.videoOutput];
    }
    
    AVCaptureConnection *connection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    
    [connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    
    self.videoInput = videoInput;
}
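
The getCameraDeviceWithPosition: helper called above is not shown in the article; a minimal sketch using AVCaptureDeviceDiscoverySession (iOS 10+) could be:

/**
 *  Sketch of the camera lookup helper referenced above (not in the original post).
 */
- (AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position
{
    AVCaptureDeviceDiscoverySession *discovery =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                                mediaType:AVMediaTypeVideo
                                                                 position:position];
    return discovery.devices.firstObject;
}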

/**
 *  Configure the audio input and output
 */
- (void)setupAudio
{
    NSError *error = nil;
    AVCaptureDeviceInput *audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio] error:&error];
    if (error)
    {
        NSLog(@"Failed to create the audioInput device input: %@", error);
        
        return;
    }
    if ([self.captureSession canAddInput:audioInput])
    {
        [self.captureSession addInput:audioInput];
    }
    
    self.audioOutput = [[AVCaptureAudioDataOutput alloc] init];
    
    [self.audioOutput setSampleBufferDelegate:self queue:self.videoQueue];
    
    if([self.captureSession canAddOutput:self.audioOutput])
    {
        [self.captureSession addOutput:self.audioOutput];
    }
}
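
Both outputs deliver their sample buffers on self.videoQueue, which the article assumes but never defines. A lazy getter for a serial background queue (assuming a matching dispatch_queue_t property in the class extension) might be:

/**  Serial queue for the sample-buffer callbacks (assumed property, not shown in the original)  */
- (dispatch_queue_t)videoQueue
{
    if (_videoQueue == nil)
    {
        _videoQueue = dispatch_queue_create("com.capture.videoQueue", DISPATCH_QUEUE_SERIAL);
    }
    return _videoQueue;
}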

Set up the preview layer:

/**
 *  Configure the preview layer
 */
- (void)setupCaptureVideoPreviewLayer
{
    _captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    
    _captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspect;           // video gravity (fill mode)
    
    [_captureVideoPreviewLayer setFrame:self.superView.bounds];
    
    [self.superView.layer addSublayer:_captureVideoPreviewLayer];
}

Then start the session running to begin capturing; a minimal sketch of starting and stopping the session appears after the delegate method below.
The captured audio and video sample buffers arrive through this delegate callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    
    @autoreleasepool
    {
        // Video
        if (connection == [self.videoOutput connectionWithMediaType:AVMediaTypeVideo])
        {
            @synchronized(self)
            {
                if (self.captureBlock) {
                    
                    self.captureBlock(sampleBuffer, AVMediaTypeVideo);
                }
            }
        }
        
        // Audio
        if (connection == [self.audioOutput connectionWithMediaType:AVMediaTypeAudio])
        {
            @synchronized(self)
            {
                if (self.captureBlock) {
                    
                    self.captureBlock(sampleBuffer, AVMediaTypeAudio);
                }
            }
        }
    }
}

Each sample buffer is handed out through captureBlock so the recording stage can write it.
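
The article never shows how the capture session is started or stopped; a minimal sketch (reusing the videoQueue property above, and keeping the blocking startRunning call off the main thread) could look like this:

// Sketch only: start and stop the capture session off the main thread.
- (void)startCapture
{
    dispatch_async(self.videoQueue, ^{
        if (!self.captureSession.isRunning)
        {
            [self.captureSession startRunning];
        }
    });
}

- (void)stopCapture
{
    dispatch_async(self.videoQueue, ^{
        if (self.captureSession.isRunning)
        {
            [self.captureSession stopRunning];
        }
    });
}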

2. Recording Audio and Video

/**  Writes the audio and video to a file  */
@property (nonatomic, strong) AVAssetWriter *assetWriter;
/**  Video input of the writer  */
@property (nonatomic, strong) AVAssetWriterInput *assetWriterVideoInput;
/**  Audio input of the writer  */
@property (nonatomic, strong) AVAssetWriterInput *assetWriterAudioInput;

AVAssetWriter does the actual recording.

First, configure the AVAssetWriter:

/**
 *  Configure the asset writer and its inputs
 */
- (void)setUpWriter
{
    if (self.videoURL == nil)
    {
        return;
    }
    
    self.assetWriter = [AVAssetWriter assetWriterWithURL:self.videoURL fileType:AVFileTypeMPEG4 error:nil];
    // Pixel count of the output video (used to estimate the bitrate)
    NSInteger numPixels = kScreenWidth * kScreenHeight;
    
    // Bits per pixel
    CGFloat bitsPerPixel = 12.0;
    NSInteger bitsPerSecond = numPixels * bitsPerPixel;
    
    // Bitrate and frame-rate settings
    NSDictionary *compressionProperties = @{ AVVideoAverageBitRateKey : @(bitsPerSecond),
                                             AVVideoExpectedSourceFrameRateKey : @(15),
                                             AVVideoMaxKeyFrameIntervalKey : @(15),
                                             AVVideoProfileLevelKey : AVVideoProfileLevelH264BaselineAutoLevel };
    CGFloat width = kScreenWidth;
    CGFloat height = kScreenHeight;
    
    // Video output settings
    NSDictionary *videoCompressionSettings = @{ AVVideoCodecKey : AVVideoCodecTypeH264,
                                                AVVideoWidthKey : @(width * 2),
                                                AVVideoHeightKey : @(height * 2),
                                                AVVideoScalingModeKey : AVVideoScalingModeResizeAspectFill,
                                                AVVideoCompressionPropertiesKey : compressionProperties };
    
    _assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings];
    // expectsMediaDataInRealTime must be YES because the data comes from the capture session in real time
    _assetWriterVideoInput.expectsMediaDataInRealTime = YES;
    
    // Audio settings
    NSDictionary *audioCompressionSettings = @{ AVEncoderBitRatePerChannelKey : @(28000),
                                                AVFormatIDKey : @(kAudioFormatMPEG4AAC),
                                                AVNumberOfChannelsKey : @(1),
                                                AVSampleRateKey : @(22050) };
    
    _assetWriterAudioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioCompressionSettings];
    
    _assetWriterAudioInput.expectsMediaDataInRealTime = YES;
    
    if ([_assetWriter canAddInput:_assetWriterVideoInput])
    {
        [_assetWriter addInput:_assetWriterVideoInput];
    }
    else
    {
        NSLog(@"Could not add assetWriterVideoInput to the asset writer");
    }
    
    if ([_assetWriter canAddInput:_assetWriterAudioInput])
    {
        [_assetWriter addInput:_assetWriterAudioInput];
    }
    else
    {
        NSLog(@"Could not add assetWriterAudioInput to the asset writer");
    }
    
    _canWrite = NO;
}
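
The writer targets self.videoURL, and the export step in part three calls a createVideoFilePath helper; neither is shown in the article. A minimal sketch that builds a unique .mp4 path in the temporary directory might be:

// Sketch only: build a unique temporary file path for the recording/export.
- (NSString *)createVideoFilePath
{
    NSString *fileName = [NSString stringWithFormat:@"%@.mp4", [[NSUUID UUID] UUIDString]];
    return [NSTemporaryDirectory() stringByAppendingPathComponent:fileName];
}

Before recording starts, self.videoURL could then be set to [NSURL fileURLWithPath:[self createVideoFilePath]].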

Then append the captured sample buffers to the writer:

/**
 *  Append captured sample buffers to the writer
 */
- (void)appendSampleBuffer:(CMSampleBufferRef)sampleBuffer ofMediaType:(NSString *)mediaType
{
    if (sampleBuffer == NULL)
    {
        NSLog(@"empty sampleBuffer");
        return;
    }
    
    @autoreleasepool
    {
        if (!self.canWrite && [mediaType isEqualToString:AVMediaTypeVideo] && self.assetWriter && self.assetWriter.status != AVAssetWriterStatusWriting)
        {
            
            [self.assetWriter startWriting];
            [self.assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
            self.canWrite = YES;
        }
        
        // Append video data
        if ([mediaType isEqualToString:AVMediaTypeVideo] && self.assetWriterVideoInput.readyForMoreMediaData)
        {
            if (![self.assetWriterVideoInput appendSampleBuffer:sampleBuffer])
            {
                @synchronized (self)
                {
                    [self stopVideoRecorder];
                }
            }
        }
        
        // Append audio data
        if ([mediaType isEqualToString:AVMediaTypeAudio] && self.assetWriterAudioInput.readyForMoreMediaData)
        {
            if (![self.assetWriterAudioInput appendSampleBuffer:sampleBuffer])
            {
                @synchronized (self)
                {
                    [self stopVideoRecorder];
                }
            }
        }
    }
}
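
On the capture side, the captureBlock from part one simply forwards each buffer into this method. A sketch of that wiring, assuming the capture and recording code live in two separate objects (the capturer and recorder property names here are hypothetical):

// Sketch only: forward every captured buffer to the recorder.
__weak __typeof(self)weakSelf = self;
self.capturer.captureBlock = ^(CMSampleBufferRef sampleBuffer, NSString *mediaType) {
    [weakSelf.recorder appendSampleBuffer:sampleBuffer ofMediaType:mediaType];
};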

When recording ends, finish writing, then save and preview the result:

/**
 *  Stop recording
 */
- (void)stopVideoRecorder
{
    __weak __typeof(self)weakSelf = self;
    
    if(_assetWriter && _assetWriter.status == AVAssetWriterStatusWriting)
    {
        [_assetWriter finishWritingWithCompletionHandler:^{
            
            weakSelf.canWrite = NO;
            
            weakSelf.assetWriter = nil;
            
            weakSelf.assetWriterAudioInput = nil;
            
            weakSelf.assetWriterVideoInput = nil;
        }];
    }
    
    // NOTE: this fixed 0.3 s delay assumes finishWriting has completed by then;
    // a more robust approach would save/preview from the completion handler above.
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.3f * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
        
        [weakSelf saveVideo];
        
        [weakSelf previewVideoAfterShoot];
    });
}
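
The article shows stopVideoRecorder but not its counterpart. A start method would typically just create the output URL, build the writer, and reset state; a sketch under those assumptions (using the createVideoFilePath helper sketched above):

// Sketch only: a matching start method (not shown in the original post).
- (void)startVideoRecorder
{
    self.videoURL = [NSURL fileURLWithPath:[self createVideoFilePath]];
    [self setUpWriter];
    // Writing actually begins when the first video sample buffer reaches
    // appendSampleBuffer:ofMediaType: above.
}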

3. Saving and Playing the Video

Before saving, the recording is trimmed (here capped at 10 seconds) and exported with AVAssetExportSession:

- (void)cropWithVideoUrlStr:(NSURL *)videoUrl completion:(void (^)(NSURL *outputURL, Float64 videoDuration, BOOL isSuccess))completionHandle
{
    AVURLAsset *asset =[[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    
    // Total duration of the recorded video
    Float64 endTime = CMTimeGetSeconds(asset.duration);
    
    // Cap the exported clip at 10 seconds
    if (endTime > 10)
    {
        endTime = 10.0f;
    }
    
    Float64 startTime = 0;
    
    NSString *outputFilePath = [self createVideoFilePath];
    
    NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
    
    NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:asset];
    
    if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality])
    {
        // Note: the compatibility check uses MediumQuality, but the session itself is created
        // with Passthrough so the clip is trimmed without re-encoding.
        AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
                                               initWithAsset:asset presetName:AVAssetExportPresetPassthrough];
        
        NSURL *outputURL = outputFileUrl;
        
        exportSession.outputURL = outputURL;
        exportSession.outputFileType = AVFileTypeMPEG4;
        exportSession.shouldOptimizeForNetworkUse = YES;
        
        CMTime start = CMTimeMakeWithSeconds(startTime, asset.duration.timescale);
        CMTime duration = CMTimeMakeWithSeconds(endTime - startTime,asset.duration.timescale);
        CMTimeRange range = CMTimeRangeMake(start, duration);
        exportSession.timeRange = range;
        
        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            switch ([exportSession status]) {
                case AVAssetExportSessionStatusFailed:
                {
                    NSLog(@"Export failed: %@", [[exportSession error] description]);
                    completionHandle(outputURL, endTime, NO);
                }
                    break;
                case AVAssetExportSessionStatusCancelled:
                {
                    completionHandle(outputURL, endTime, NO);
                }
                    break;
                case AVAssetExportSessionStatusCompleted:
                {
                    completionHandle(outputURL, endTime, YES);
                }
                    break;
                default:
                {
                    completionHandle(outputURL, endTime, NO);
                } break;
            }
        }];
    }
}

Then save the exported video to the device's photo library.

The Photos framework is used for saving:

/**
 Save the video
 */
- (void)saveVideo
{
    [self cropWithVideoUrlStr:self.videoURL completion:^(NSURL *videoUrl, Float64 videoDuration, BOOL isSuccess) {
        
        if (isSuccess)
        {
            NSDictionary *infoDictionary = [[NSBundle mainBundle] infoDictionary];
            
            NSString * assetCollectionName = [infoDictionary objectForKey:@"CFBundleDisplayName"];
            
            if (assetCollectionName == nil)
            {
                assetCollectionName = @"Videos";
            }
            
            __block NSString *blockAssetCollectionName = assetCollectionName;
            
            __block NSURL *blockVideoUrl = videoUrl;
            
            PHPhotoLibrary *library = [PHPhotoLibrary sharedPhotoLibrary];
            
            // performChangesAndWait: blocks the calling thread, so run this work off the main queue
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                
                NSError *error = nil;
                __block NSString *assetId = nil;
                __block NSString *assetCollectionId = nil;
                
                // Save the video to the Camera Roll
                [library performChangesAndWait:^{
                    
                    assetId = [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:blockVideoUrl].placeholderForCreatedAsset.localIdentifier;
                    
                } error:&error];
                
                NSLog(@"error1: %@", error);
                
                // Look up a previously created custom album with this app's name
                PHAssetCollection *createdAssetCollection = nil;
                PHFetchResult <PHAssetCollection*> *assetCollections = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeAlbum subtype:PHAssetCollectionSubtypeAlbumRegular options:nil];
                for (PHAssetCollection *assetCollection in assetCollections)
                {
                    if ([assetCollection.localizedTitle isEqualToString:blockAssetCollectionName])
                    {
                        createdAssetCollection = assetCollection;
                        break;
                    }
                }
                
                // If the custom album has not been created yet
                if (createdAssetCollection == nil)
                {
                    // Create a new custom album
                    [library performChangesAndWait:^{
                        
                        assetCollectionId = [PHAssetCollectionChangeRequest creationRequestForAssetCollectionWithTitle:blockAssetCollectionName].placeholderForCreatedAssetCollection.localIdentifier;
                        
                    } error:&error];
                    
                    NSLog(@"error2: %@", error);
                    
                    // Fetch the album that was just created
                    createdAssetCollection = [PHAssetCollection fetchAssetCollectionsWithLocalIdentifiers:@[assetCollectionId] options:nil].firstObject;
                    
                }
                
                // Add the Camera Roll asset to the custom album
                [library performChangesAndWait:^{
                    PHAssetCollectionChangeRequest *request = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:createdAssetCollection];
                    
                    [request addAssets:[PHAsset fetchAssetsWithLocalIdentifiers:@[assetId] options:nil]];
                    
                } error:&error];
                
                NSLog(@"error3: %@", error);
                
            });
        }
        else
        {
            NSLog(@"Failed to save the video!");
            
            [[NSFileManager defaultManager] removeItemAtURL:self.videoURL error:nil];
            
            self.videoURL = nil;
            
            [[NSFileManager defaultManager] removeItemAtURL:videoUrl error:nil];
        }
    }];
}
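
Writing to the photo library also requires authorization (and the NSPhotoLibraryUsageDescription / NSPhotoLibraryAddUsageDescription keys in Info.plist), which the article skips. A minimal sketch before calling saveVideo:

// Sketch only: request photo-library permission before saving.
[PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
    if (status == PHAuthorizationStatusAuthorized)
    {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self saveVideo];
        });
    }
    else
    {
        NSLog(@"Photo library access was not granted.");
    }
}];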

Now the video can be played back.

AVPlayer handles playback:

/**  Video preview container view  */
@property (strong, nonatomic) UIView *videoPreviewContainerView;
/**  Player  */
@property (strong, nonatomic) AVPlayer *player;
- (void)previewVideoAfterShoot
{
    if (self.videoURL == nil || self.videoPreviewContainerView != nil)
    {
        return;
    }
    
    AVURLAsset *asset = [AVURLAsset assetWithURL:self.videoURL];
    
    // Set up the preview container and AVPlayer
    self.videoPreviewContainerView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, kScreenWidth, kScreenHeight)];
    
    self.videoPreviewContainerView.backgroundColor = [UIColor blackColor];
    
    AVPlayerItem * playerItem = [AVPlayerItem playerItemWithAsset:asset];
    
    self.player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
    
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    
    playerLayer.frame = CGRectMake(0, 0, kScreenWidth, kScreenHeight);
    
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    
    [self.videoPreviewContainerView.layer addSublayer:playerLayer];
    
    // Remaining UI layout
    [self.view addSubview:self.videoPreviewContainerView];
    [self.view bringSubviewToFront:self.videoPreviewContainerView];
    
    // Loop the preview: observe when playback reaches the end
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playVideoFinished:) name:AVPlayerItemDidPlayToEndTimeNotification object:playerItem];
    
    // Start playback
    [self.player play];
}
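
The playVideoFinished: handler registered above is not shown in the article; a minimal looping implementation could be:

// Sketch only: restart playback from the beginning when the preview finishes.
- (void)playVideoFinished:(NSNotification *)notification
{
    [self.player seekToTime:kCMTimeZero];
    [self.player play];
}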

GitHub repository
