user2786

Reputation: 716

How to add a fade-in and fade-out effect to a video AVAsset in Swift 3 (iOS)

I am developing a video application in Swift 3 in which I have to convert text to a video, add a fade-in and fade-out effect to it, and then post the resulting video to a server. I can't use any third-party library for the fade effect.

I am able to convert my text to a video; my problem is how to add the fade in and fade out to the video AVAsset.

Can anyone suggest how to achieve this? I cannot find any recent answers to this problem. Thanks for any help!

Upvotes: 2

Views: 1788

Answers (2)

kalpesh

Reputation: 1287

Fade-out effect

let parentLayer = CALayer()
let fadeOut = CABasicAnimation(keyPath: "opacity")
fadeOut.fromValue = 1.0
fadeOut.toValue = 0.0
fadeOut.duration = 5.0 // set this to the video's duration
fadeOut.setValue("video", forKey: "fadeOut")
fadeOut.isRemovedOnCompletion = false // keep the final opacity after the animation ends
fadeOut.fillMode = CAMediaTimingFillMode.forwards
parentLayer.add(fadeOut, forKey: "opacity")

Fade-in effect

let fadeIn = CABasicAnimation(keyPath: "opacity")
fadeIn.fromValue = 0.0
fadeIn.toValue = 1.0
// configure duration, fillMode, etc. the same way as fadeOut above

Add to your player

self.playerView?.playerLayer?.add(fadeOut, forKey: nil)
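
Note that adding the animation to a player layer only fades the on-screen preview; it is not rendered into an exported file. To bake a CALayer fade into the export itself, the layer tree has to be handed to the video composition through AVVideoCompositionCoreAnimationTool. A minimal sketch, assuming a `videoComposition` (AVMutableVideoComposition) and a `videoSize` for the render area, neither of which is shown in the original answer:

// Sketch only: `videoComposition` and `videoSize` are assumed to exist elsewhere.
let videoLayer = CALayer()
videoLayer.frame = CGRect(origin: .zero, size: videoSize)
parentLayer.frame = videoLayer.frame
parentLayer.addSublayer(videoLayer)

// During export, Core Animation runs on the composition's timeline, so a
// begin time of 0 must be expressed as AVCoreAnimationBeginTimeAtZero.
fadeOut.beginTime = AVCoreAnimationBeginTimeAtZero

videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
    postProcessingAsVideoLayer: videoLayer,
    in: parentLayer
)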

Add to your assets

var startTime = CMTime.zero
var timeDuration = CMTimeMake(value: 3, timescale: 1)
// `videoTrack` is the source video track and `mutableComposition`
// the AVMutableComposition being exported.
let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)

//MARK: Fade in effect
layerInstruction.setOpacityRamp(fromStartOpacity: 0.0, toEndOpacity: 1.0, timeRange: CMTimeRange(start: startTime, duration: timeDuration))

//MARK: Fade out effect
startTime = CMTimeSubtract(mutableComposition.duration, CMTimeMake(value: 3, timescale: 1))
timeDuration = CMTimeMake(value: 3, timescale: 1)
layerInstruction.setOpacityRamp(
    fromStartOpacity: 1.0,
    toEndOpacity: 0.0,
    timeRange: CMTimeRange(start: startTime, duration: timeDuration)
)
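
To make the opacity ramps actually apply, the layer instruction has to be wrapped in a video composition and attached to the player item (for preview) or the export session. A minimal sketch, continuing with the `mutableComposition`, `videoTrack`, and `layerInstruction` from above:

let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRange(start: .zero, duration: mutableComposition.duration)
instruction.layerInstructions = [layerInstruction]

let videoComposition = AVMutableVideoComposition()
videoComposition.instructions = [instruction]
videoComposition.renderSize = videoTrack.naturalSize
videoComposition.frameDuration = CMTimeMake(value: 1, timescale: 30) // 30 fps

// For preview; for export, set it on AVAssetExportSession.videoComposition instead.
let playerItem = AVPlayerItem(asset: mutableComposition)
playerItem.videoComposition = videoComposition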

Upvotes: 3

Harshal Valanda

Reputation: 5451

AVVideoCompositionLayerInstruction

An array of instances of AVVideoCompositionLayerInstruction that specify how video frames from source tracks should be layered and composed.

AVMutableVideoCompositionInstruction

An AVVideoComposition object maintains an array of instructions to perform its composition.

Example (Swift 4): I merged videos with a fade-in and fade-out effect and sequenced separate audio tracks alongside them.

func doMerge(arrayVideos:[AVAsset], arrayAudios:[AVAsset], animation:Bool, completion:@escaping Completion) -> Void {

    var insertTime = kCMTimeZero
    var audioInsertTime = kCMTimeZero
    var arrayLayerInstructions:[AVMutableVideoCompositionLayerInstruction] = []
    var outputSize = CGSize.init(width: 0, height: 0)

    // Determine the output size (this loop ends up using the last asset's orientation-corrected size)
    for videoAsset in arrayVideos {
        let videoTrack = videoAsset.tracks(withMediaType: AVMediaType.video)[0]
        let assetInfo = orientationFromTransform(transform: videoTrack.preferredTransform)
        var videoSize = videoTrack.naturalSize
        if assetInfo.isPortrait == true {
            videoSize.width = videoTrack.naturalSize.height
            videoSize.height = videoTrack.naturalSize.width
        }
        outputSize = videoSize
    }

    // Init composition
    let mixComposition = AVMutableComposition.init()

    for index in 0..<arrayVideos.count {
        // Get video track
        guard let videoTrack = arrayVideos[index].tracks(withMediaType: AVMediaType.video).first else { continue }

        // Get audio track
        var audioTrack:AVAssetTrack?
        if index < arrayAudios.count {
            if arrayAudios[index].tracks(withMediaType: AVMediaType.audio).count > 0 {
                audioTrack = arrayAudios[index].tracks(withMediaType: AVMediaType.audio).first
            }
        }
        // Init video & audio composition track
        let videoCompositionTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))

        let audioCompositionTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))

        do {
            let startTime = kCMTimeZero
            let duration = arrayVideos[index].duration

            // Add video track to video composition at specific time
            try videoCompositionTrack?.insertTimeRange(CMTimeRangeMake(startTime, duration), of: videoTrack, at: insertTime)

            // Add audio track to audio composition at specific time
            var audioDuration = kCMTimeZero
            if index < arrayAudios.count   {
                 audioDuration = arrayAudios[index].duration
            }

            if let audioTrack = audioTrack {
                do {
                    try audioCompositionTrack?.insertTimeRange(CMTimeRangeMake(startTime, audioDuration), of: audioTrack, at: audioInsertTime)
                }
                catch {
                    print(error.localizedDescription)
                }
            }

            // Add instruction for video track
            let layerInstruction = videoCompositionInstructionForTrack(track: videoCompositionTrack!, asset: arrayVideos[index], standardSize: outputSize, atTime: insertTime)

            // Hide video track before changing to new track
            let endTime = CMTimeAdd(insertTime, duration)

            if animation {
                let timeScale = arrayVideos[index].duration.timescale
                let durationAnimation = CMTime.init(seconds: 1, preferredTimescale: timeScale)

                layerInstruction.setOpacityRamp (fromStartOpacity: 1.0, toEndOpacity: 0.0, timeRange: CMTimeRange.init(start: endTime, duration: durationAnimation))
            }
            else {
                layerInstruction.setOpacity(0, at: endTime)
            }

            arrayLayerInstructions.append(layerInstruction)

            // Increase the insert time
            audioInsertTime = CMTimeAdd(audioInsertTime, audioDuration)
            insertTime = CMTimeAdd(insertTime, duration)
        }
        catch {
            print("Load track error")
        }
    }

    // Main video composition instruction
    let mainInstruction = AVMutableVideoCompositionInstruction()
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, insertTime)
    mainInstruction.layerInstructions = arrayLayerInstructions

    // Main video composition
    let mainComposition = AVMutableVideoComposition()
    mainComposition.instructions = [mainInstruction]
    mainComposition.frameDuration = CMTimeMake(1, 30)
    mainComposition.renderSize = outputSize

    // Export to file
    let path = NSTemporaryDirectory().appending("mergedVideo.mp4")
    let exportURL = URL.init(fileURLWithPath: path)

    // Remove file if existed
    FileManager.default.removeItemIfExisted(exportURL)

    // Init exporter
    let exporter = AVAssetExportSession.init(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter?.outputURL = exportURL
    exporter?.outputFileType = AVFileType.mp4
    exporter?.shouldOptimizeForNetworkUse = true
    exporter?.videoComposition = mainComposition

    // Do export
    exporter?.exportAsynchronously(completionHandler: {
        DispatchQueue.main.async {
            self.exportDidFinish(exporter: exporter, videoURL: exportURL, completion: completion)
        }
    })

}
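
The function above relies on a few helpers that the answer doesn't show (`orientationFromTransform`, `videoCompositionInstructionForTrack`, `exportDidFinish`, `FileManager.removeItemIfExisted`, and the `Completion` typealias). Minimal sketches of what they might look like follow, written against the same Swift 4-era API names; these are assumptions filled in for completeness, not part of the original answer:

import AVFoundation
import UIKit

typealias Completion = (URL?) -> Void // assumed signature

// Reads the track's preferredTransform to detect portrait sources.
func orientationFromTransform(transform: CGAffineTransform) -> (orientation: UIImageOrientation, isPortrait: Bool) {
    var assetOrientation = UIImageOrientation.up
    var isPortrait = false
    if transform.a == 0 && transform.b == 1.0 && transform.c == -1.0 && transform.d == 0 {
        assetOrientation = .right
        isPortrait = true
    } else if transform.a == 0 && transform.b == -1.0 && transform.c == 1.0 && transform.d == 0 {
        assetOrientation = .left
        isPortrait = true
    } else if transform.a == -1.0 && transform.b == 0 && transform.c == 0 && transform.d == -1.0 {
        assetOrientation = .down
    }
    return (assetOrientation, isPortrait)
}

// Minimal version: only carries the source transform over. A complete
// version would also scale and translate the track to `standardSize`.
func videoCompositionInstructionForTrack(track: AVCompositionTrack, asset: AVAsset, standardSize: CGSize, atTime: CMTime) -> AVMutableVideoCompositionLayerInstruction {
    let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    if let assetTrack = asset.tracks(withMediaType: .video).first {
        instruction.setTransform(assetTrack.preferredTransform, at: atTime)
    }
    return instruction
}

func exportDidFinish(exporter: AVAssetExportSession?, videoURL: URL, completion: Completion) {
    completion(exporter?.status == .completed ? videoURL : nil)
}

extension FileManager {
    func removeItemIfExisted(_ url: URL) {
        if fileExists(atPath: url.path) {
            try? removeItem(at: url)
        }
    }
}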

Upvotes: 1
