This article covers compressing a video, and mixing the video's original audio with a second audio track at an adjustable ratio.
- Overall approach: create a new path for the temporary compressed video -> compress the video -> combine the audio and video.
1 Create a new path
```
// tempDir is assumed to come from NSTemporaryDirectory()
NSString *tempDir = NSTemporaryDirectory();
self.tmpVideoPath = [tempDir stringByAppendingPathComponent:@"myMovie.mp4"];
```
2 Check whether a file already exists at the new path, and delete it if so
```
NSURL *url = [NSURL fileURLWithPath:self.tmpVideoPath];
NSFileManager *fm = [NSFileManager defaultManager];
BOOL exist = [fm fileExistsAtPath:url.path];
NSError *err = nil;
if (exist) {
    [fm removeItemAtURL:url error:&err];
    NSLog(@"file deleted");
    if (err) {
        NSLog(@"file remove error, %@", err.localizedDescription);
    }
} else {
    NSLog(@"no file by that name");
}
```
3 Compress the video
> Before compressing, some background: AVAsset is an abstract, immutable class that defines how a media resource is modeled and presented. Creating one with `assetWithURL:` actually returns an instance of its concrete subclass AVURLAsset, because the abstract AVAsset cannot be instantiated directly.
With AVURLAsset we can create an asset with an (optional) options dictionary, for example to get more precise duration and timing information; a minimal sketch follows below.
You can use the `exportPresetsCompatibleWithAsset:` method to check whether a given preset can be used.
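A minimal sketch of creating an asset with an options dictionary and listing the compatible presets (the `videoUrl` below is assumed to point at a local media file):
```
// Ask for precise duration/timing, at the cost of slower loading
NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey : @YES};
AVURLAsset *preciseAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:options];
// List every export preset that can be used with this asset
NSArray *presets = [AVAssetExportSession exportPresetsCompatibleWithAsset:preciseAsset];
NSLog(@"compatible presets: %@", presets);
```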
```
NSURL *videoUrl = [NSURL fileURLWithPath:videoUrlString];
self.asset = [AVAsset assetWithURL:videoUrl];
AVAsset *anAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:anAsset];
// Check whether the chosen preset is among the supported presets
if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality]) {
    self.exportSession = [[AVAssetExportSession alloc] initWithAsset:self.asset presetName:AVAssetExportPresetMediumQuality];
    NSURL *furl = [NSURL fileURLWithPath:self.tmpVideoPath];
    self.exportSession.outputURL = furl;
    // The file is identified by the .mp4 extension
    self.exportSession.outputFileType = AVFileTypeMPEG4;
    // Set the trimming range
    CMTime start = CMTimeMakeWithSeconds(self.startTime, self.asset.duration.timescale);
    CMTime duration = CMTimeMakeWithSeconds(self.stopTime - self.startTime, self.asset.duration.timescale);
    CMTimeRange range = CMTimeRangeMake(start, duration);
    self.exportSession.timeRange = range;
    [self.exportSession exportAsynchronouslyWithCompletionHandler:^{
        switch ([self.exportSession status]) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", [[self.exportSession error] localizedDescription]);
                break;
            case AVAssetExportSessionStatusCancelled:
                break;
            default:
                // Compression succeeded
                dispatch_async(dispatch_get_main_queue(), ^{
                    // Mixing ratio taken from the slider (0-100)
                    CGFloat progress = _smarkSlider.value / 100;
                    // (1 - progress) is the original video's share; progress is the inserted audio's share
                    [self addVideoVolumn:1 - progress andAudioVolumn:progress];
                    // // Create a notification object
                    // NSNotification *notice = [NSNotification notificationWithName:@"PUSH" object:nil userInfo:nil];
                    // // Post the notification
                    // [[NSNotificationCenter defaultCenter] postNotification:notice];
                });
                break;
        }
    }];
}
```
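One detail worth knowing here: the preset checked with `exportPresetsCompatibleWithAsset:` and the one passed to `initWithAsset:presetName:` should match. AVAssetExportPresetPassthrough merely copies samples through without re-encoding, so it does not shrink the file; actual compression needs a quality preset, which is why AVAssetExportPresetMediumQuality is used above.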
4 Mix the original video's audio and the inserted audio into a new track at the given ratio
```
/*
 * Extract the original video's audio and mix it with the desired music
 */
-(void)addVideoVolumn:(CGFloat)volumnVideo andAudioVolumn:(CGFloat)volumnAudio
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    // NSMutableArray *audioMixParams; the array of per-track mix parameters
    _audioMixParams = [[NSMutableArray alloc] initWithCapacity:0];
    // The recorded video (the one we just compressed)
    NSURL *video_inputFileUrl = [[NSURL alloc] initFileURLWithPath:self.tmpVideoPath];
    AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:video_inputFileUrl options:nil];
    CMTime startTime = CMTimeMakeWithSeconds(0, songAsset.duration.timescale);
    CMTime trackDuration = songAsset.duration;
    // Pull the audio material out of the video
    [self setUpAndAddAudioAtPath:video_inputFileUrl toComposition:composition start:startTime dura:trackDuration offset:CMTimeMake(0, 44100) andVolume:volumnVideo];
    // The local music to insert; the path is looked up by the audio's timestamp
    NSURL *url = [[NSURL alloc] initFileURLWithPath:[[AudioDao getCurrentWorkDataByWorkdate:_model.audioTime].newfile getFilePathOfDocuments]];
    // Load the prepared audio material
    AVURLAsset *songAsset1 = [AVURLAsset URLAssetWithURL:url options:nil];
    CMTime startTime1 = CMTimeMakeWithSeconds(0.06 + self.startTime, songAsset1.duration.timescale);
    CMTime trackDuration1 = songAsset1.duration;
    [self setUpAndAddAudioAtPath:url toComposition:composition start:startTime1 dura:trackDuration1 offset:CMTimeMake(0, 44100) andVolume:volumnAudio];
    // Create a mutable audio mix
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = [NSArray arrayWithArray:_audioMixParams]; // the processed track parameters collected in the array
    // Create an export session
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc]
                                      initWithAsset:composition
                                      presetName:AVAssetExportPresetAppleM4A];
    exporter.audioMix = audioMix;
    exporter.outputFileType = AVFileTypeAppleM4A; // @"com.apple.m4a-audio"
    NSString *fileName = [NSString stringWithFormat:@"%@.mov", @"overMix"]; // don't change the format
    // Output path
    NSString *exportFile = [NSString stringWithFormat:@"%@/%@", [self getLibarayPath], fileName];
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportFile]) {
        [[NSFileManager defaultManager] removeItemAtPath:exportFile error:nil];
    }
    NSURL *exportURL = [NSURL fileURLWithPath:exportFile];
    exporter.outputURL = exportURL;
    self.mixURL = exportURL;
    __weak typeof(self) weakSelf = self;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        int exportStatus = (int)exporter.status;
        switch (exportStatus) {
            case AVAssetExportSessionStatusFailed: {
                NSError *exportError = exporter.error;
                NSLog(@"Export failed: %@", exportError);
                break;
            }
            case AVAssetExportSessionStatusCompleted: {
                NSLog(@"Succeeded; on main thread? %d", (int)[NSThread isMainThread]);
                // Final mix with the video
                [weakSelf theVideoWithMixMusic];
                break;
            }
            case AVAssetExportSessionStatusExporting: {
                NSLog(@"Current export progress: %f", exporter.progress);
                break;
            }
        }
    }];
}
```
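A design note on the preset choice: AVAssetExportPresetAppleM4A produces an audio-only M4A export, so only the mixed audio tracks survive this pass; the video itself is re-attached in step 5, which reads the result back from `self.mixURL`.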
```
/*
 * Build and add an audio track to the composition from a file URL
 */
- (void)setUpAndAddAudioAtPath:(NSURL *)assetURL toComposition:(AVMutableComposition *)composition start:(CMTime)start dura:(CMTime)dura offset:(CMTime)offset andVolume:(float)volumn {
    AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
    AVMutableCompositionTrack *track = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *sourceAudioTrack = [[songAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    NSError *error = nil;
    BOOL ok = NO;
    CMTime startTime = start;
    CMTime trackDuration = dura;
    CMTimeRange tRange = CMTimeRangeMake(startTime, trackDuration);
    // Set this track's volume from the given time onward.
    // AVMutableAudioMixInputParameters holds the mutable mix parameters for one input track;
    // audioMixInputParametersWithTrack: binds those parameters to the track.
    AVMutableAudioMixInputParameters *trackMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [trackMix setVolume:volumn atTime:startTime];
    // Collect the parameters into the array
    [_audioMixParams addObject:trackMix];
    // Insert the audio into the track; offset is CMTimeMake(0, 44100)
    ok = [track insertTimeRange:tRange ofTrack:sourceAudioTrack atTime:offset error:&error];
    if (!ok) {
        NSLog(@"insertTimeRange failed: %@", error.localizedDescription);
    }
}
```
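As an aside, AVMutableAudioMixInputParameters is not limited to the hard volume jump that `setVolume:atTime:` produces; it can also fade between two volumes over a time range. A minimal sketch, assuming we wanted a 2-second fade-in instead of the fixed volume above (`trackMix` and `volumn` as in the method just shown):
```
// Fade from silence up to the target volume over the first 2 seconds
CMTimeRange fadeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(2, 44100));
[trackMix setVolumeRampFromStartVolume:0.0 toEndVolume:volumn timeRange:fadeRange];
```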
5 Combine the video with the mixed audio
```
-(void)theVideoWithMixMusic
{
    // Audio source path (the mixed audio from step 4)
    NSURL *audio_inputFileUrl = self.mixURL;
    // Video source path
    NSURL *video_inputFileUrl = [NSURL fileURLWithPath:self.tmpVideoPath];
    // Final output path (documentsDirectory is assumed to be the app's Documents directory)
    NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
    _outputFilePath = [documentsDirectory stringByAppendingPathComponent:@"finalvideo.mp4"];
    NSURL *outputFileUrl = [NSURL fileURLWithPath:_outputFilePath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:_outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:_outputFilePath error:nil];
    CMTime nextClipStartTime = kCMTimeZero;
    // Create a mutable audio/video composition
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    // Video track
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:video_inputFileUrl options:nil];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];
    // Audio track
    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audio_inputFileUrl options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration); // clip the audio to the video's length
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];
    AVMutableVideoComposition *mainComposition = [AVMutableVideoComposition videoComposition];
    mainComposition.frameDuration = CMTimeMake(1, 30);
    NSInteger videoRoate = [self.class degressFromVideoFileWithURL:video_inputFileUrl];
    CGAffineTransform translateToCenter;
    CGAffineTransform mixedTransform;
    if (videoRoate == 0) {
        translateToCenter = CGAffineTransformMakeTranslation(0.0, 0.0);
        mixedTransform = CGAffineTransformRotate(translateToCenter, 0);
        mainComposition.renderSize = CGSizeMake(a_compositionVideoTrack.naturalSize.width, a_compositionVideoTrack.naturalSize.height);
    } else if (videoRoate == 90) {
        // Rotate 90° clockwise
        NSLog(@"Video rotated 90°, home button on the left");
        translateToCenter = CGAffineTransformMakeTranslation(a_compositionVideoTrack.naturalSize.height, 0.0);
        mixedTransform = CGAffineTransformRotate(translateToCenter, M_PI_2);
        mainComposition.renderSize = CGSizeMake(a_compositionVideoTrack.naturalSize.height, a_compositionVideoTrack.naturalSize.width);
    } else if (videoRoate == 180) {
        // Rotate 180° clockwise
        NSLog(@"Video rotated 180°, home button on top");
        translateToCenter = CGAffineTransformMakeTranslation(a_compositionVideoTrack.naturalSize.width, a_compositionVideoTrack.naturalSize.height);
        mixedTransform = CGAffineTransformRotate(translateToCenter, M_PI);
        mainComposition.renderSize = CGSizeMake(a_compositionVideoTrack.naturalSize.width, a_compositionVideoTrack.naturalSize.height);
    } else if (videoRoate == 270) {
        // Rotate 270° clockwise
        NSLog(@"Video rotated 270°, home button on the right");
        translateToCenter = CGAffineTransformMakeTranslation(0.0, a_compositionVideoTrack.naturalSize.width);
        mixedTransform = CGAffineTransformRotate(translateToCenter, M_PI_2 * 3.0);
        mainComposition.renderSize = CGSizeMake(a_compositionVideoTrack.naturalSize.height, a_compositionVideoTrack.naturalSize.width);
    }
    AVMutableVideoCompositionInstruction *roateInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    roateInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mixComposition duration]);
    AVMutableVideoCompositionLayerInstruction *roateLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:a_compositionVideoTrack];
    [roateLayerInstruction setTransform:mixedTransform atTime:kCMTimeZero];
    roateInstruction.layerInstructions = @[roateLayerInstruction];
    // Add the rotation to the video processing
    mainComposition.instructions = @[roateInstruction];
    // Create an export session
    _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:_exportSet];
    _assetExport.outputFileType = AVFileTypeMPEG4;
    _assetExport.outputURL = outputFileUrl;
    _assetExport.shouldOptimizeForNetworkUse = YES;
    _assetExport.videoComposition = mainComposition;
    // Poll the export progress with a GCD timer on the main queue
    // (dispatch_source_t is still an ObjC object under ARC)
    dispatch_queue_t queue = dispatch_get_main_queue();
    [self setProgressView];
    _timer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, queue);
    dispatch_time_t start = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC));
    uint64_t interval = (uint64_t)(0.1 * NSEC_PER_SEC);
    dispatch_source_set_timer(_timer, start, interval, 0);
    // Set the timer callback
    dispatch_source_set_event_handler(_timer, ^{
        [self exportingProgressDicChanged];
    });
    dispatch_resume(_timer);
    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        dispatch_async(dispatch_get_main_queue(), ^{
            dispatch_source_cancel(_timer); // stop the progress timer
            if (_assetExport.status == AVAssetExportSessionStatusCompleted) {
                [LCProgressHUD showMessage:@"Mixing finished"];
            }
        });
    }];
    // Note: this logs as soon as the export starts, not when it finishes
    NSLog(@"Done! Output path == %@", _outputFilePath);
}
```
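The rotation helper `degressFromVideoFileWithURL:` is called above but never shown. A minimal sketch of one common implementation, assuming the angle is derived from the first video track's preferredTransform (the spelling of the name follows the call site):
```
// Map the video track's preferredTransform to a rotation in degrees
+ (NSInteger)degressFromVideoFileWithURL:(NSURL *)url {
    AVAsset *asset = [AVAsset assetWithURL:url];
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if (tracks.count == 0) {
        return 0;
    }
    CGAffineTransform t = [tracks.firstObject preferredTransform];
    if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
        return 90;   // portrait
    } else if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
        return 270;  // portrait upside down
    } else if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
        return 180;  // landscape, rotated
    }
    return 0;        // landscape, no rotation
}
```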
To be honest, quite a few of the concepts and properties above were not obvious to me, so I'll pull them out here and explain them separately.
- `CMTime start = CMTimeMakeWithSeconds(self.startTime, self.asset.duration.timescale);`
https://depthlove.github.io/2016/03/15/CMTime-CMTimeMake-CMTimeMakeWithSeconds-in-iOS-encode/ covers this in detail.
`CMTimeMake(a, b)`: a is the current frame number, b is the frames per second; the current playback time is a/b.
`CMTimeMakeWithSeconds(a, b)`: a is the current time in seconds, b is the timescale (units per second). A quick illustration follows.
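A short numeric example (hypothetical values), showing that both constructors can express the same 3-second timestamp:
```
CMTime t1 = CMTimeMake(90, 30);            // frame 90 at 30 fps -> 90/30 = 3 s
CMTime t2 = CMTimeMakeWithSeconds(3, 600); // 3 s expressed with a timescale of 600
NSLog(@"%f %f", CMTimeGetSeconds(t1), CMTimeGetSeconds(t2)); // both print 3.000000
```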
- The difference between AVAsset and AVURLAsset
AVURLAsset is the concrete subclass of AVAsset used to initialize an asset from a local or remote URL; it inherits from AVAsset.
- Bitrate: informally, think of it like a sampling rate: the more data captured per unit of time, the higher the precision and the closer the processed file is to the original, but file size grows in proportion to the rate. Almost every codec therefore focuses on achieving the least distortion at the lowest possible bitrate. As a rough formula: bitrate (kbps) = file size (KB) × 8 / duration (seconds).
A few rules of thumb about bitrate:
1. Bitrate is proportional to quality, but file size is also proportional to bitrate. Keep this in mind.
2. Beyond a certain value, raising the bitrate barely improves image quality.
3. DVD capacity is limited, whether it's the standard 4.3 GB, an overburned disc, or D9; there is always a ceiling.
On units: computers represent information as binary 0s and 1s. Each 0 or 1 is a bit, written as a lowercase b; an uppercase B is a byte, and 1 B = 8 b. The prefix K means 1024, so Kb is 1024 bits and KB is 1024 bytes; file sizes are usually quoted in bytes (KB).
Kbps: the "ps" means "/s", i.e. per second. Kbps measures network speed: how many kilobits are transferred each second. ISPs usually quote kilobits because the numbers look faster; 1 KB/s = 8 Kbps, so a 512 Kbps ADSL line works out to 512/8 = 64 KB/s.
4. As a rule, on a 1 Mbps connection you can only watch videos up to about 1024 kbps smoothly; anything above that has to buffer first. A worked example follows.
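A worked example of the rough formula above, with hypothetical numbers (an 11.25 MB clip lasting 90 seconds):
```
double fileSizeKB = 11.25 * 1024;          // 11.25 MB = 11520 KB
double durationS  = 90;
double kbps = fileSizeKB * 8 / durationS;  // 92160 Kb / 90 s = 1024 kbps
NSLog(@"average bitrate ≈ %.0f kbps", kbps);
// 1024 kbps is exactly the ceiling that rule 4 gives for a 1 Mbps connection
```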