Becky Hansmeyer

Reputation: 712

AVMutableAudioMix multiple volume changes to single track

I'm working on an app that merges multiple video clips into one final video. I would like to give users the ability to mute individual clips if desired (so, only parts of the final merged video would be muted). I have wrapped the AVAssets in a class called "Video" that has a "shouldMute" property.

My problem is that when I set the volume of one of the AVAssetTracks to zero, it stays muted for the remainder of the final video. Here is my code:

    var completeDuration: CMTime = CMTimeMake(0, 1)
    var insertTime = kCMTimeZero
    var layerInstructions = [AVVideoCompositionLayerInstruction]()
    let mixComposition = AVMutableComposition()
    let audioMix = AVMutableAudioMix()

    let videoTrack = mixComposition.addMutableTrack(withMediaType: .video,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioTrack = mixComposition.addMutableTrack(withMediaType: .audio,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid)

    // iterate through video assets and merge together
    for (i, video) in clips.enumerated() {

        let videoAsset = video.asset
        let clipDuration = videoAsset.duration

        do {
            if video == clips.first {
                insertTime = kCMTimeZero
            } else {
                insertTime = completeDuration
            }

            if let videoAssetTrack = videoAsset.tracks(withMediaType: .video).first {
                try videoTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, clipDuration), of: videoAssetTrack, at: insertTime)
                completeDuration = CMTimeAdd(completeDuration, clipDuration)
            }

            if let audioAssetTrack = videoAsset.tracks(withMediaType: .audio).first {
                try audioTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, clipDuration), of: audioAssetTrack, at: insertTime)

                if video.shouldMute {
                    let audioMixInputParams = AVMutableAudioMixInputParameters()
                    audioMixInputParams.trackID = audioTrack!.trackID
                    audioMixInputParams.setVolume(0.0, at: insertTime)
                    audioMix.inputParameters.append(audioMixInputParams)
                }
            }

        } catch let error as NSError {
            print("error: \(error)")
        }

        let videoInstruction = videoCompositionInstructionForTrack(track: videoTrack!, video: video)
        if video != clips.last {
            videoInstruction.setOpacity(0.0, at: completeDuration)
        }

        layerInstructions.append(videoInstruction)
    } // end of video asset iteration

If I add another setVolume:atTime instruction to increase the volume back to 1.0 at the end of the clip, then the first volume instruction is completely ignored and the whole video plays at full volume.

In other words, this isn't working:

    if video.shouldMute {
        let audioMixInputParams = AVMutableAudioMixInputParameters()
        audioMixInputParams.trackID = audioTrack!.trackID
        audioMixInputParams.setVolume(0.0, at: insertTime)
        audioMixInputParams.setVolume(1.0, at: completeDuration)
        audioMix.inputParameters.append(audioMixInputParams)
    }

I have set the audioMix on both my AVPlayerItem and AVAssetExportSession. What am I doing wrong? What can I do to allow users to mute the time ranges of individual clips before merging into the final video?

Upvotes: 2

Views: 1102

Answers (1)

Becky Hansmeyer

Reputation: 712

Apparently I was going about this wrong. As you can see above, my composition has two AVMutableCompositionTracks: a video track and an audio track. Even though I inserted the time ranges of a series of other tracks into those two tracks, there are still ultimately only two tracks. So, I only needed one AVMutableAudioMixInputParameters object to associate with my one audio track.

I initialized a single AVMutableAudioMixInputParameters object and then, after I inserted the time range of each clip, I'd check to see whether it should be muted and set a volume ramp for the clip's time range (the time range in relation to the entire audio track). Here's what that looks like, inside my clip iteration:

    if let audioAssetTrack = videoAsset.tracks(withMediaType: .audio).first {
        try audioTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, clipDuration), of: audioAssetTrack, at: insertTime)

        if video.shouldMute {
            audioMixInputParams.setVolumeRamp(fromStartVolume: 0.0, toEndVolume: 0.0, timeRange: CMTimeRangeMake(insertTime, clipDuration))
        } else {
            audioMixInputParams.setVolumeRamp(fromStartVolume: 1.0, toEndVolume: 1.0, timeRange: CMTimeRangeMake(insertTime, clipDuration))
        }
    }
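For context, here is a sketch of how the pieces fit around that loop (the `playerItem` and `exportSession` names are just stand-ins for whatever consumes the mix in your setup): the parameters object is created once before the iteration, tied to the composition's single audio track, and attached to the mix after the loop.

```swift
import AVFoundation

// Created once, before the clip iteration, and bound to the
// composition's single audio track (this also sets its trackID).
let audioMixInputParams = AVMutableAudioMixInputParameters(track: audioTrack)

// ... clip iteration with the setVolumeRamp calls shown above ...

// After the loop, the one parameters object is the mix's entire input list.
audioMix.inputParameters = [audioMixInputParams]

// Hand the same mix to both consumers:
playerItem.audioMix = audioMix        // for preview playback
exportSession?.audioMix = audioMix    // for the final export
```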

Upvotes: 5
