【iOS】Video Recording Research (Part 1)

1. Ways to Implement Video Recording

  • UIImagePickerController
  • AVCaptureSession + AVCaptureMovieFileOutput
  • AVCaptureSession + AVAssetWriter

UIImagePickerController: a system-provided UI that outputs a video file directly.
AVCaptureSession + AVCaptureMovieFileOutput: supports a custom UI and outputs a video file directly.
AVCaptureSession + AVAssetWriter: supports a custom UI, but outputs raw video and audio frames that you must process and assemble into a video file yourself.

2. The System-Provided UIImagePickerController

2.1 The UIImagePickerController Approach

This is currently the simplest way to integrate the camera, but it does not support a custom camera UI. UIImagePickerController is a view controller that encapsulates a complete video capture pipeline together with a camera UI.

2.1.1 Info.plist Settings

Privacy - Microphone Usage Description: the message shown when asking the user for permission to use the microphone.
Privacy - Camera Usage Description: the message shown when asking the user for permission to use the camera.

2.1.2 Checking Whether Camera Recording Is Supported

Before instantiating the camera, first check whether the device supports camera recording:

        /// Check whether video recording is supported
        if UIImagePickerController.isSourceTypeAvailable(UIImagePickerController.SourceType.camera) {
            if let availableMediaTypes = UIImagePickerController.availableMediaTypes(for: UIImagePickerController.SourceType.camera) {
                if !availableMediaTypes.contains("public.movie") {
                    print("Video recording is not supported")
                    return
                }
            }
        }

2.1.3 Permission Checks

Confirm camera and microphone permissions:

        /// Camera permission
        AVCaptureDevice.requestAccess(for: AVMediaType.video) {[weak self] (granted) in
            guard let weakSelf = self else {
                return
            }
            if !granted {
                print("No permission to access the camera")
                return
            }
            
            // Microphone permission
            AVCaptureDevice.requestAccess(for: AVMediaType.audio) {[weak self] (granted) in
                guard let weakSelf = self else {
                    return
                }
                
                if !granted {
                    print("No permission to access the microphone")
                    return
                }
                // Present the recording screen (presentation must happen on the main thread)
                DispatchQueue.main.async {
                    weakSelf.present(weakSelf.pickerController, animated: true, completion: nil)
                }
            }
        }
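
As a small optional refinement beyond the original flow, you can query the current authorization status first and only call requestAccess when the status is .notDetermined. A minimal sketch (the helper name is an assumption):

    /// Sketch: check the current camera authorization before requesting it.
    func checkCameraAuthorization(_ completion: @escaping (Bool) -> Void) {
        switch AVCaptureDevice.authorizationStatus(for: .video) {
        case .authorized:
            completion(true)
        case .notDetermined:
            AVCaptureDevice.requestAccess(for: .video) { granted in
                completion(granted)
            }
        default:
            // .denied or .restricted: guide the user to Settings if needed
            completion(false)
        }
    }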

2.1.4 Creating a UIImagePickerController Object

Then create a UIImagePickerController object and set its delegate so you can further process the recorded video (for example, save it to the photo library) and respond to the user closing the camera:

    lazy var pickerController: UIImagePickerController = {
        let pickerController = UIImagePickerController()
        // Source type: camera (alternatives include the photo library)
        pickerController.sourceType = UIImagePickerController.SourceType.camera
        // Media types: public.image, public.movie
        pickerController.mediaTypes = ["public.movie"]
        // Camera device: front or rear
        pickerController.cameraDevice = UIImagePickerController.CameraDevice.rear
        // Camera flash mode
        // pickerController.cameraFlashMode = UIImagePickerController.CameraFlashMode.auto
        // Video quality
        pickerController.videoQuality = UIImagePickerController.QualityType.typeHigh
        // Maximum recording duration (seconds)
        pickerController.videoMaximumDuration = 30
        // Allow the user to edit the recording
        pickerController.allowsEditing = false
        // Delegate
        pickerController.delegate = self
        return pickerController
    }()

2.1.5 Implementing UIImagePickerControllerDelegate

extension RecordVideoViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate
{
    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
        let mediaType = info[UIImagePickerController.InfoKey.mediaType] as! String

        if mediaType == "public.movie" {
            // URL of the recorded video file
            let mediaURL = info[UIImagePickerController.InfoKey.mediaURL] as! NSURL
            // Path of the video file
            let pathString = mediaURL.relativePath
            print("Video path: " + pathString!)
            
            DispatchQueue.global().async {
                // Check whether the video can be saved to the photo library
                if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(pathString!)) {
                    // Save the video to the photo library
                    UISaveVideoAtPathToSavedPhotosAlbum(pathString!, self,  #selector(self.saveVideo(videoPath:didFinishSavingWithError:contextInfo:)), nil)
                }
                }
                DispatchQueue.main.async {
                    picker.dismiss(animated: true, completion: nil)
                }
            }
            
        }
    }
    
    @objc private func saveVideo(videoPath: String, didFinishSavingWithError error: NSError?, contextInfo: AnyObject) {
        if error != nil {
            print("Failed to save the video")
        } else {
            print("Video saved successfully")
        }
    }
}
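
To respond to the user closing the camera (mentioned above but not shown in the original delegate), the same extension can also implement imagePickerControllerDidCancel. A minimal sketch:

    // Hypothetical addition (not in the original post): dismiss the picker when the user cancels.
    func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
        picker.dismiss(animated: true, completion: nil)
    }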

3. A Custom Camera with AVFoundation

The central class for video capture in AVFoundation is AVCaptureSession. It coordinates the flow of data between the audio/video inputs and outputs.

3.1 The AVCaptureSession + AVCaptureMovieFileOutput Approach

3.1.1 Info.plist Settings

Privacy - Microphone Usage Description: the message shown when asking the user for permission to use the microphone.
Privacy - Camera Usage Description: the message shown when asking the user for permission to use the camera.

3.1.2 Permission Checks

Confirm camera and microphone permissions:

        /// Camera permission
        AVCaptureDevice.requestAccess(for: AVMediaType.video) {[weak self] (granted) in
            guard let weakSelf = self else {
                return
            }
            if !granted {
                print("No permission to access the camera")
                return
            }
            
            // Microphone permission
            AVCaptureDevice.requestAccess(for: AVMediaType.audio) {[weak self] (granted) in
                guard let weakSelf = self else {
                    return
                }
                
                if !granted {
                    print("No permission to access the microphone")
                    return
                }
                
            }
        }
        }

3.1.3 Creating the AVCaptureSession

To use a capture session, first instantiate it, add the inputs and outputs, set the resolution, and then start the data flow from inputs to outputs (see the sketch after the snippet below):

    /// Video capture session
    let captureSession = AVCaptureSession()
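
As a rough sketch of the overall order (assuming the helper methods defined in the following subsections, plus a hypothetical setupPreviewAndStart() wrapping the preview/startRunning code in 3.1.8):

    override func viewDidLoad() {
        super.viewDidLoad()
        // Configure the session: inputs, resolution, output, then preview + startRunning().
        addInputVideo()        // 3.1.4
        addInputAudio()        // 3.1.5
        setPreset()            // 3.1.6
        setOutput()            // 3.1.7
        setupPreviewAndStart() // hypothetical wrapper around the code shown in 3.1.8
    }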

3.1.4 Adding the Video Input Device

    //MARK: Add the video input device
    func addInputVideo() {
        self.captureSession.beginConfiguration()

        if let videoDevice = AVCaptureDevice.default(for: AVMediaType.video),
           let videoInput = try? AVCaptureDeviceInput(device: videoDevice),
           self.captureSession.canAddInput(videoInput) {
            self.captureSession.addInput(videoInput)
        }
        
        self.captureSession.commitConfiguration()
    }

3.1.5 Adding the Audio Input Device

    //MARK: Add the audio input device
    func addInputAudio() {
        self.captureSession.beginConfiguration()
        
        if let audioDevice = AVCaptureDevice.default(for: AVMediaType.audio),
           let audioInput = try? AVCaptureDeviceInput(device: audioDevice),
           self.captureSession.canAddInput(audioInput) {
            self.captureSession.addInput(audioInput)
        }
        self.captureSession.commitConfiguration()
    }

3.1.6 Setting the Resolution

    //MARK: Set the resolution
    func setPreset() {
        self.captureSession.beginConfiguration()
        if self.captureSession.canSetSessionPreset(AVCaptureSession.Preset.hd1280x720) {
            self.captureSession.sessionPreset = AVCaptureSession.Preset.hd1280x720
        }
        self.captureSession.commitConfiguration()
    }
    

3.1.7 Configuring the Output

    //MARK: Configure the output
    func setOutput() {
        self.captureSession.beginConfiguration()
        
        // The default movie fragment interval is 10 seconds; setting it to invalid disables fragment writing
        self.fileOutput.movieFragmentInterval = CMTime.invalid
        if self.captureSession.canAddOutput(self.fileOutput) {
            self.captureSession.addOutput(self.fileOutput)
        }
        
        // The video connection only exists after the output has been added to the session
        if let captureConnection = self.fileOutput.connection(with: AVMediaType.video) {
            // Enable video stabilization if it is supported
            if captureConnection.isVideoStabilizationSupported {
                captureConnection.preferredVideoStabilizationMode = .auto
            }
            // Keep the recorded video orientation consistent with the preview layer
            if let previewOrientation = self.videoLayer.connection?.videoOrientation {
                captureConnection.videoOrientation = previewOrientation
            }
        }
        self.captureSession.commitConfiguration()
    }
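
The fileOutput property used here is not declared in the original post; presumably it is a plain movie file output:

    /// Assumed property (not shown in the original post): the movie file output
    let fileOutput = AVCaptureMovieFileOutput()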

3.1.8 Displaying the Capture Preview and Starting the Session

    /// Camera preview layer
    lazy var videoLayer: AVCaptureVideoPreviewLayer = {
        let videoLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
        videoLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        videoLayer.masksToBounds = true
        return videoLayer
    }()
    // AVCaptureVideoPreviewLayer displays the camera's live preview on top of the view controller
        DispatchQueue.main.async {
            weakSelf.videoLayer.frame = weakSelf.view.bounds
            weakSelf.view.layer.addSublayer(weakSelf.videoLayer)
            weakSelf.captureSession.startRunning()
            // Create the start/stop buttons
            weakSelf.setUI()
        }

3.1.9 Starting Recording

    //MARK: Start recording
    @objc func starRecordVideo() {
        
        if !self.isRecording {
            // Build the destination path for the recording
            let filePath = self.getNewPath(videoTyle: AVFileType.mp4)
            let fileURL = URL(fileURLWithPath: filePath)
            // Start the movie file output
            fileOutput.startRecording(to: fileURL, recordingDelegate: self)
            
            // Update state: recording...
            self.isRecording = true
            // Update the start/stop button colors
            self.starButton.backgroundColor = UIColor.lightGray
            self.starButton.isEnabled = false
            
            self.stopButton.backgroundColor = UIColor.blue
            self.stopButton.isEnabled = true
        }
    }
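
getNewPath(videoTyle:) is not shown in the original post; it presumably builds a unique file path in a writable directory. A minimal sketch of such a helper (names and details are assumptions):

    /// Sketch: build a unique file path in the temporary directory for the given file type.
    func getNewPath(videoTyle: AVFileType) -> String {
        let ext = (videoTyle == AVFileType.mp4) ? "mp4" : "mov"
        let fileName = "record_\(Int(Date().timeIntervalSince1970)).\(ext)"
        return NSTemporaryDirectory() + fileName
    }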

3.1.10 Stopping Recording

    //MARK: Stop recording
    @objc func stopRecordVideo() {
        if self.isRecording {
            // Stop the movie file output
            fileOutput.stopRecording()
            
            // Update state: recording finished
            self.isRecording = false
            
            // Update the start/stop button colors
            self.starButton.backgroundColor = UIColor.red
            self.starButton.isEnabled = true
            
            self.stopButton.backgroundColor = UIColor.lightGray
            self.stopButton.isEnabled = false
        }
    }

3.1.11 AVCaptureFileOutputRecordingDelegate

//MARK: AVCaptureFileOutputRecordingDelegate
extension RecordVideo2ViewController:AVCaptureFileOutputRecordingDelegate
{
    // Recording started
    func fileOutput(_ output: AVCaptureFileOutput, didStartRecordingTo fileURL: URL, from connections: [AVCaptureConnection]) {
        
    }
    // Recording finished
    func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
        // Get the size of the video file
        self.getVideoSize(videoUrl: outputFileURL)
        // Get the duration of the video file
        self.getVideoLength(videoUrl: outputFileURL)
        // Save a copy to the photo library for easy testing
        self.saveVideoToAlbum(videoUrl: outputFileURL)
        // Grab a frame at the specified time
        self.getImage(videoUrl: outputFileURL, cmtime: CMTimeMake(value: 1, timescale: 1), width: 300)
        
        // Compress the video
        let newPath = self.getNewPath(videoTyle: AVFileType.mov)
        print(newPath)
        self.convertVideo(inputURL: outputFileURL, outputURL: URL(fileURLWithPath: newPath), presetName: AVAssetExportPresetMediumQuality) { (success) in
            if success {
                print("Compression succeeded")
            } else {
                print("Compression failed")
            }
        }
    }
}
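
The helper methods called above (getVideoSize, getVideoLength, saveVideoToAlbum, getImage, convertVideo) are not shown in the original post. As one example, convertVideo most likely wraps AVAssetExportSession; a rough sketch under that assumption:

    /// Sketch: export (compress) a video with AVAssetExportSession using the given preset.
    func convertVideo(inputURL: URL, outputURL: URL, presetName: String, completion: @escaping (Bool) -> Void) {
        let asset = AVURLAsset(url: inputURL)
        guard let exportSession = AVAssetExportSession(asset: asset, presetName: presetName) else {
            completion(false)
            return
        }
        exportSession.outputURL = outputURL
        exportSession.outputFileType = AVFileType.mov
        exportSession.shouldOptimizeForNetworkUse = true
        exportSession.exportAsynchronously {
            completion(exportSession.status == .completed)
        }
    }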

3.2 The AVCaptureSession + AVAssetWriter Approach

For finer-grained control over the video, you can use AVCaptureVideoDataOutput and AVCaptureAudioDataOutput to capture the video and audio sample buffers separately, instead of AVCaptureMovieFileOutput.
Through their delegates, AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate, you can process the sample buffers (for example, apply a filter to the video) or pass them along unchanged.
An AVAssetWriter object then writes the sample buffers to a file.

3.2.1 Info.plist Settings

Privacy - Microphone Usage Description: the message shown when asking the user for permission to use the microphone.
Privacy - Camera Usage Description: the message shown when asking the user for permission to use the camera.

3.2.2 Permission Checks

Confirm camera and microphone permissions:

        /// Camera permission
        AVCaptureDevice.requestAccess(for: AVMediaType.video) {[weak self] (granted) in
            guard let weakSelf = self else {
                return
            }
            if !granted {
                print("No permission to access the camera")
                return
            }
            
            // Microphone permission
            AVCaptureDevice.requestAccess(for: AVMediaType.audio) {[weak self] (granted) in
                guard let weakSelf = self else {
                    return
                }
                
                if !granted {
                    print("No permission to access the microphone")
                    return
                }
                
            }
        }

3.2.3 Creating the AVCaptureSession

To use a capture session, first instantiate it, add the inputs and outputs, set the resolution, and then start the data flow from inputs to outputs:

    /// Video capture session
    let captureSession = AVCaptureSession()

3.2.4 Adding the Video Input Device

    //MARK: Add the video input device
    func addInputVideo() {
        self.captureSession.beginConfiguration()

        if let videoDevice = AVCaptureDevice.default(for: AVMediaType.video),
           let videoInput = try? AVCaptureDeviceInput(device: videoDevice),
           self.captureSession.canAddInput(videoInput) {
            self.captureSession.addInput(videoInput)
        }
        
        self.captureSession.commitConfiguration()
    }

3.2.5 Adding the Audio Input Device

    //MARK: Add the audio input device
    func addInputAudio() {
        self.captureSession.beginConfiguration()
        
        if let audioDevice = AVCaptureDevice.default(for: AVMediaType.audio),
           let audioInput = try? AVCaptureDeviceInput(device: audioDevice),
           self.captureSession.canAddInput(audioInput) {
            self.captureSession.addInput(audioInput)
        }
        self.captureSession.commitConfiguration()
    }

3.2.6 Setting the Resolution

    //MARK: Set the resolution
    func setPreset() {
        self.captureSession.beginConfiguration()
        if self.captureSession.canSetSessionPreset(AVCaptureSession.Preset.hd1280x720) {
            self.captureSession.sessionPreset = AVCaptureSession.Preset.hd1280x720
        }
        self.captureSession.commitConfiguration()
    }
    

3.2.7 Adding the Video and Audio Data Outputs

    //MARK: Add the video and audio data outputs
    func addOutputVideoAndAudio() {
        self.captureSession.beginConfiguration()
        
        self.videoDataOutput.setSampleBufferDelegate(self, queue: self.sessionQueue)
        if self.captureSession.canAddOutput(self.videoDataOutput) {
            self.captureSession.addOutput(self.videoDataOutput)
        }
        
        self.audioDataOutput.setSampleBufferDelegate(self, queue: self.sessionQueue)
        if self.captureSession.canAddOutput(self.audioDataOutput) {
            self.captureSession.addOutput(self.audioDataOutput)
        }
        
        self.captureSession.commitConfiguration()
    }
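
The videoDataOutput, audioDataOutput, and sessionQueue used above are not declared in the original post; presumably something along these lines:

    /// Assumed properties (not shown in the original post)
    let videoDataOutput = AVCaptureVideoDataOutput()
    let audioDataOutput = AVCaptureAudioDataOutput()
    /// Serial queue used for the sample buffer callbacks (label is hypothetical)
    let sessionQueue: DispatchQueue? = DispatchQueue(label: "com.example.record.sessionQueue")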

3.2.8 Displaying the Capture Preview and Starting the Session

    /// Camera preview layer
    lazy var videoLayer: AVCaptureVideoPreviewLayer = {
        let videoLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
        videoLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        videoLayer.masksToBounds = true
        return videoLayer
    }()
    // AVCaptureVideoPreviewLayer displays the camera's live preview on top of the view controller
        DispatchQueue.main.async {
            weakSelf.videoLayer.frame = weakSelf.view.bounds
            weakSelf.view.layer.addSublayer(weakSelf.videoLayer)
            weakSelf.captureSession.startRunning()
            // Create the start/stop buttons
            weakSelf.setUI()
        }

3.2.9 Starting Recording

    //MARK: Start recording
    @objc func starRecordVideo() {
        
        if !self.isRecording {
            print("Start recording")
            // Build the destination path for the recording
            self.videoPath = self.getNewPath(videoTyle: AVFileType.mp4)
            print(self.videoPath)
            setAssetWriter(videoPath: self.videoPath)
            
            // Update state: recording...
            self.isRecording = true
            // Update the start/stop button colors
            self.starButton.backgroundColor = UIColor.lightGray
            self.starButton.isEnabled = false
            
            self.stopButton.backgroundColor = UIColor.blue
            self.stopButton.isEnabled = true
        }
        
    }

3.2.10 Setting Up the AVAssetWriter

    //MARK: Set up the AVAssetWriter
    func setAssetWriter(videoPath: String) {
        // Destination URL for the recording
        let fileURL = URL(fileURLWithPath: videoPath)
        
        if let assetWriter = try? AVAssetWriter.init(url: fileURL, fileType: AVFileType.mp4) {
            self.assetWriter = assetWriter
            
            var width = UIScreen.main.bounds.size.height
            var height = UIScreen.main.bounds.size.width
            if (false) // placeholder in the original post: whether the device has a notch
            {
                width = UIScreen.main.bounds.size.height - 146
                height = UIScreen.main.bounds.size.width
            }
            // Output video size
            let numPixels = width * height
            // Bits per pixel
            let bitsPerPixel: CGFloat = 12.0
            let bitsPerSecond = numPixels * bitsPerPixel
                
            let compressionProperties = [
                // Average bit rate = pixel count * bits per pixel; larger values give finer detail
                AVVideoAverageBitRateKey : bitsPerSecond,
                // Expected source frame rate
                AVVideoExpectedSourceFrameRateKey : 15,
                // Maximum key-frame interval; 1 makes every frame a key frame, larger values compress more
                AVVideoMaxKeyFrameIntervalKey : 15,
                // Profile/level, i.e. picture quality
                AVVideoProfileLevelKey : AVVideoProfileLevelH264BaselineAutoLevel
            ] as [String : Any]
            
            let videoCompressionSettings = [
                AVVideoCodecKey : AVVideoCodecH264,
                AVVideoWidthKey : width * 2,
                AVVideoHeightKey : height * 2,
//                AVVideoScalingModeKey : AVVideoScalingModeResizeAspectFill,
                AVVideoCompressionPropertiesKey : compressionProperties
            ] as [String : Any]
            
            self.assetWriterVideoInput = AVAssetWriterInput.init(mediaType: AVMediaType.video, outputSettings: videoCompressionSettings)
            // expectsMediaDataInRealTime must be true because the data comes from a live capture session
            self.assetWriterVideoInput?.expectsMediaDataInRealTime = true
            
            // Audio settings
            let audioCompressionSettings = [
                // Bit rate per channel
                AVEncoderBitRatePerChannelKey : 28000,
                // Recording format
                AVFormatIDKey : kAudioFormatMPEG4AAC,
                // Number of channels (1 = mono, 2 = stereo; MP3 requires stereo)
                AVNumberOfChannelsKey : 1,
                // Sample rate; 8000 is telephone quality, which is enough for ordinary recordings
                AVSampleRateKey : 22050,
                // Bits per sample: 8, 16, 24, or 32
                AVLinearPCMBitDepthKey: 16,
                // Encoder quality
                AVEncoderAudioQualityKey: AVAudioQuality.medium
            ] as [String : Any]
            
            self.assetWriterAudioInput = AVAssetWriterInput.init(mediaType: AVMediaType.audio, outputSettings: audioCompressionSettings)
            self.assetWriterAudioInput?.expectsMediaDataInRealTime = true
            if self.assetWriter!.canAdd(self.assetWriterVideoInput!) {
                self.assetWriter?.add(self.assetWriterVideoInput!)
            }
            
            if self.assetWriter!.canAdd(self.assetWriterAudioInput!) {
                self.assetWriter?.add(self.assetWriterAudioInput!)
            }
            
            self.canWrite = false
            
        } else {
            print("Failed to create the AVAssetWriter")
        }
    }
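
The writer-related properties used throughout this section (assetWriter, assetWriterVideoInput, assetWriterAudioInput, canWrite, videoPath, isRecording) are not declared in the original post; they are presumably something like:

    /// Assumed properties (not shown in the original post)
    var assetWriter: AVAssetWriter?
    var assetWriterVideoInput: AVAssetWriterInput?
    var assetWriterAudioInput: AVAssetWriterInput?
    /// Whether the writer session has been started
    var canWrite = false
    /// Destination path of the current recording
    var videoPath = ""
    /// Recording state flag
    var isRecording = false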

3.2.11 Processing Each Video and Audio Frame

extension RecordVideo3ViewController: AVCaptureVideoDataOutputSampleBufferDelegate,AVCaptureAudioDataOutputSampleBufferDelegate
{
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {        
        autoreleasepool {
            if !isRecording {
                return
            }
            if connection == self.videoDataOutput.connection(with: AVMediaType.video) {
                objc_sync_enter(self)
                self.appendSampleBuffer(sampleBuffer: sampleBuffer, mediaType: AVMediaType.video)
                objc_sync_exit(self)
            }
            if connection == self.audioDataOutput.connection(with: AVMediaType.audio) {
                objc_sync_enter(self)
                self.appendSampleBuffer(sampleBuffer: sampleBuffer, mediaType: AVMediaType.audio)
                objc_sync_exit(self)
            }
        }
    }
    func appendSampleBuffer(sampleBuffer: CMSampleBuffer, mediaType: AVMediaType) {
        autoreleasepool {
            if (!self.canWrite && mediaType == AVMediaType.video) {
                print("Start writing with AVAssetWriter")
                self.assetWriter?.startWriting()
                self.assetWriter?.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
                self.canWrite = true
            }
            
            // Append video data
            if mediaType == AVMediaType.video {
                if self.assetWriterVideoInput!.isReadyForMoreMediaData {
                    let success = self.assetWriterVideoInput!.append(sampleBuffer)
                    if !success {
                        // Stop recording
                        objc_sync_enter(self)
                        self.stopRecordVideo()
                        objc_sync_exit(self)
                    }
                }
            }
            
            // Append audio data
            if mediaType == AVMediaType.audio {
                if self.assetWriterAudioInput!.isReadyForMoreMediaData {
                    let success = self.assetWriterAudioInput!.append(sampleBuffer)
                    if !success {
                        // Stop recording
                        objc_sync_enter(self)
                        self.stopRecordVideo()
                        objc_sync_exit(self)
                    }
                }
            }
        }
    }
}

3.2.12 Stopping Recording

    //MARK: Stop recording
    @objc func stopRecordVideo() {
        if self.isRecording {
            print("Stop recording")
            // Finish writing the output file
            if self.assetWriter != nil && self.assetWriter?.status == AVAssetWriter.Status.writing {
                self.assetWriter?.finishWriting { [weak self] in
                    guard let weakSelf = self else {
                        return
                    }
                    weakSelf.canWrite = false
                    weakSelf.assetWriter = nil
                    weakSelf.assetWriterAudioInput = nil
                    weakSelf.assetWriterVideoInput = nil
                }
            }
            
            // Update state: recording finished
            self.isRecording = false
            
            // Update the start/stop button colors
            self.starButton.backgroundColor = UIColor.red
            self.starButton.isEnabled = true
            
            self.stopButton.backgroundColor = UIColor.lightGray
            self.stopButton.isEnabled = false
        }
    }