iOS: Audio Merging, Audio/Video Merging, and Trimming

In iOS, audio merging means combining two separate sound files into a single output file. Audio/video merging applies when a video has no sound and you want to mix an audio track into it; ideally the audio duration matches the video duration, so the resulting file plays back cleanly. Trimming is the inverse of merging and allows finer-grained processing of both audio and video.

A Brief Look at CMTime

Merging and trimming audio/video requires precise timing, and all of the intermediate times involved are expressed as CMTime values. Here is the structure:

typedef struct
{
 CMTimeValue value;  /*! @field value The value of the CMTime. value/timescale = seconds. */
 CMTimeScale timescale; /*! @field timescale The timescale of the CMTime. value/timescale = seconds.  */
 CMTimeFlags flags;  /*! @field flags The flags, eg. kCMTimeFlags_Valid, kCMTimeFlags_PositiveInfinity, etc. */
 CMTimeEpoch epoch;  /*! @field epoch Differentiates between equal timestamps that are actually different because
             of looping, multi-item sequencing, etc.  
             Will be used during comparison: greater epochs happen after lesser ones. 
             Additions/subtraction is only possible within a single epoch,
             however, since epoch length may be unknown/variable. */
} CMTime;

value is the numerator and timescale is the denominator (value/timescale = seconds). Below are several common CMTime operations, with CMTimeShow printing each result:

    CMTime startTime = CMTimeMake(13, 100);
    CMTimeShow(startTime);
    
    CMTime endTime = CMTimeMake(40, 100);
    CMTimeShow(endTime);
    
    CMTime addTime = CMTimeAdd(startTime, endTime);
    CMTimeShow(addTime);
    
    CMTime subTime = CMTimeSubtract(startTime,endTime);
    CMTimeShow(subTime);
    
    // CMTimeRangeMake takes (start, duration), so endTime is used here as a duration
    CMTimeRange timeRange = CMTimeRangeMake(startTime, endTime);
    CMTimeRangeShow(timeRange);
    
    // CMTimeRangeFromTimeToTime takes (start, end) and computes the duration itself
    CMTimeRange fromRange = CMTimeRangeFromTimeToTime(startTime, endTime);
    CMTimeRangeShow(fromRange);
    
    // Intersection of the two ranges
    CMTimeRange intersectionRange = CMTimeRangeGetIntersection(timeRange, fromRange);
    CMTimeRangeShow(intersectionRange);
    // Union of the two ranges
    CMTimeRange unionRange = CMTimeRangeGetUnion(timeRange, fromRange);
    CMTimeRangeShow(unionRange);
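
Because value/timescale = seconds, converting between CMTime and plain seconds is straightforward with CMTimeGetSeconds and CMTimeMakeWithSeconds. A small sketch (the values are just illustrative):

    // CMTime -> seconds
    Float64 seconds = CMTimeGetSeconds(CMTimeMake(13, 100));   // 0.13
    
    // seconds -> CMTime; the second argument is the preferred timescale
    CMTime fromSeconds = CMTimeMakeWithSeconds(0.13, 100);
    CMTimeShow(fromSeconds);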

Audio Merging

Audio merging only needs the AVFoundation framework for its file operations; the one limitation is that the exported file is always in m4a format.
Setting up the audio merge:

    NSString *wayPath = [[NSBundle mainBundle] pathForResource:@"MyWay" ofType:@"mp3"];
    NSString *easyPath = [[NSBundle mainBundle] pathForResource:@"Easy" ofType:@"mp3"];
    NSMutableArray *dataArr = [NSMutableArray array];
    [dataArr addObject:[NSURL fileURLWithPath:wayPath]];
    [dataArr addObject:[NSURL fileURLWithPath:easyPath]];
    NSString *destPath = [[self composeDir] stringByAppendingString:@"FlyElephant.m4a"];
    
    if ([[NSFileManager defaultManager] fileExistsAtPath:destPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:destPath error:nil];
    }
    [self audioMerge:dataArr destUrl:[NSURL fileURLWithPath:destPath]];

Core code:

- (void)audioMerge:(NSMutableArray *)dataSource destUrl:(NSURL *)destUrl{
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    
    // Start time: the insertion point for each successive asset
    CMTime beginTime = kCMTimeZero;
    // Add an audio track to the composition to hold the merged audio
    AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    
    NSError *error = nil;
    for (NSURL *sourceURL in dataSource) {
        // Audio asset for this source file
        AVURLAsset  *audioAsset = [[AVURLAsset alloc] initWithURL:sourceURL options:nil];
        // Time range of the source audio to merge (the whole file)
        CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
        // ofTrack: the source asset's first audio track
        BOOL success = [compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject] atTime:beginTime error:&error];
        
        if (!success) {
            NSLog(@"Error: %@",error);
        }
        beginTime = CMTimeAdd(beginTime, audioAsset.duration);
    }
    // presetName and outputFileType must match; export the merged audio
    AVAssetExportSession *assetExportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetAppleM4A];
    assetExportSession.outputURL = destUrl;
    assetExportSession.outputFileType = AVFileTypeAppleM4A;
    assetExportSession.shouldOptimizeForNetworkUse = YES;
    [assetExportSession exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            if (assetExportSession.status == AVAssetExportSessionStatusCompleted) {
                NSLog(@"FlyElephant merge completed: %@", destUrl);
            } else {
                NSLog(@"FlyElephant merge error: %@", assetExportSession.error);
            }
        });
    }];
}
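
To quickly verify the merged file, you can play it back once the export completes. A minimal sketch, assuming the same destination path as above (note that the AVAudioPlayer needs a strong reference, e.g. a property, or playback stops as soon as it is deallocated):

    NSString *mergedPath = [[self composeDir] stringByAppendingString:@"FlyElephant.m4a"];
    NSError *playError = nil;
    AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:mergedPath] error:&playError];
    [player prepareToPlay];
    [player play];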

Audio/Video Merging

Merging audio into video works much like audio merging. Setting up the merge:

    NSString *wayPath = [[NSBundle mainBundle] pathForResource:@"MyWay" ofType:@"mp3"];
    NSString *easyPath = [[NSBundle mainBundle] pathForResource:@"Way" ofType:@"mp4"];
    
    NSString *destPath = [[self composeDir] stringByAppendingString:@"FlyElephant.mp4"];
    
    if ([[NSFileManager defaultManager] fileExistsAtPath:destPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:destPath error:nil];
    }
    [self audioVideoMerge:[NSURL fileURLWithPath:wayPath] videoUrl:[NSURL fileURLWithPath:easyPath] destUrl:[NSURL fileURLWithPath:destPath]];

Core code:

- (void)audioVideoMerge:(NSURL *)audioUrl videoUrl:(NSURL *)videoUrl destUrl:(NSURL *)destUrl {
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    NSError *error;
    
    AVMutableCompositionTrack *audioCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    // Audio asset
    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioUrl options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    [audioCompositionTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject] atTime:kCMTimeZero error:&error];
    
    // Video asset
    AVMutableCompositionTrack *videoCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    [videoCompositionTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject] atTime:kCMTimeZero error:&error];
    
    // presetName and outputFileType must match; export the merged movie
    // (AVFileTypeMPEG4 matches the .mp4 destination path)
    AVAssetExportSession *assetExportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    assetExportSession.outputURL = destUrl;
    assetExportSession.outputFileType = AVFileTypeMPEG4;
    assetExportSession.shouldOptimizeForNetworkUse = YES;
    [assetExportSession exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            if (assetExportSession.status == AVAssetExportSessionStatusCompleted) {
                NSLog(@"FlyElephant merge completed: %@", destUrl);
            } else {
                NSLog(@"FlyElephant merge error: %@", assetExportSession.error);
            }
        });
    }];
}
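
As noted in the introduction, the result is cleanest when the audio and video durations match. If the audio is longer than the video, one option is to clamp the audio time range to the video's duration before inserting it (which means loading the video asset first). A sketch of how the audio_timeRange above could be computed; this adjustment is my own, not part of the original code:

    // Clamp the audio to the video's length so the export ends with the video
    CMTime commonDuration = CMTimeMinimum(audioAsset.duration, videoAsset.duration);
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, commonDuration);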

Audio Trimming

Trimming is simpler than merging, since it only operates on a single file:

    NSString *inputPath = [[self composeDir] stringByAppendingString:@"FlyElephant.m4a"];
    
    [self audioCrop:[NSURL fileURLWithPath:inputPath] startTime:CMTimeMake(30, 1) endTime:CMTimeMake(48, 1)];

Core code:

- (void)audioCrop:(NSURL *)url startTime:(CMTime)startTime endTime:(CMTime)endTime {
    
    NSString *outPutPath = [[self composeDir] stringByAppendingPathComponent:@"Crop.m4a"];
    NSURL *audioFileOutput = [NSURL fileURLWithPath:outPutPath];
    
    [[NSFileManager defaultManager] removeItemAtURL:audioFileOutput error:NULL];
    AVAsset *asset = [AVAsset assetWithURL:url];
    
    AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:asset
                                                                            presetName:AVAssetExportPresetAppleM4A];
    CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, endTime);
    
    exportSession.outputURL = audioFileOutput;
    exportSession.outputFileType = AVFileTypeAppleM4A;
    exportSession.timeRange = exportTimeRange;
    
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (AVAssetExportSessionStatusCompleted == exportSession.status) {
            NSLog(@" FlyElephant \n %@", outPutPath);
        } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
            NSLog(@"FlyElephant error: %@", exportSession.error.localizedDescription);
        }
    }];
}
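
Trimming a video follows the same pattern; only the preset and output file type change. A minimal sketch of a video counterpart (the method name, the Crop.mp4 output and the medium-quality preset are my own choices, not from the original post):

- (void)videoCrop:(NSURL *)url startTime:(CMTime)startTime endTime:(CMTime)endTime {
    NSString *outPutPath = [[self composeDir] stringByAppendingPathComponent:@"Crop.mp4"];
    NSURL *videoFileOutput = [NSURL fileURLWithPath:outPutPath];
    [[NSFileManager defaultManager] removeItemAtURL:videoFileOutput error:NULL];
    
    AVAsset *asset = [AVAsset assetWithURL:url];
    AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:asset
                                                                            presetName:AVAssetExportPresetMediumQuality];
    exportSession.outputURL = videoFileOutput;
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.timeRange = CMTimeRangeFromTimeToTime(startTime, endTime);
    
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (AVAssetExportSessionStatusCompleted == exportSession.status) {
            NSLog(@"FlyElephant video crop \n %@", outPutPath);
        } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
            NSLog(@"FlyElephant video crop error: %@", exportSession.error.localizedDescription);
        }
    }];
}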

File path helper:

- (NSString *)composeDir {
    NSString *cacheDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) firstObject];
    NSFileManager *fileManager  = [NSFileManager defaultManager];
    NSString *composeDir = [NSString stringWithFormat:@"%@/AudioCompose/", cacheDir];
    BOOL isDir = NO;
    BOOL existed = [fileManager fileExistsAtPath:composeDir isDirectory:&isDir];
    if (!(existed && isDir)) {
        [fileManager createDirectoryAtPath:composeDir withIntermediateDirectories:YES attributes:nil error:nil];
    }
    return composeDir;
}

That covers the basic operations for merging and trimming audio and video. If anything is unclear, feel free to raise it in the comments.
