user7404038

ScaleTimeRange of only a part of video

The scaleTimeRange(timeRange: CMTimeRange, toDuration duration: CMTime) method works very well if one wants to apply a slow motion effect to a video.

But I noticed that it only works when applied to the video's entire duration. If an arbitrary timeRange, e.g. CMTimeRangeMake(_ start: 2, duration: 3), is passed, the method doesn't seem to work at all, i.e. when the mp4 video is exported it doesn't have the desired slow motion effect (from 0:00:02 to 0:00:05).

Q1) Is there a way to apply the scaleTimeRange method to only a part of the video? If so, how can it be done?

Q2) If not, how can this slow motion effect be applied to only a part of the video? Is there any other way?

CODE:

var asset: AVAsset?

func setupAsset() {

    let videoAsset = AVURLAsset(url: Bundle.main.url(forResource: "Sample", withExtension: "mp4")!)

    let comp = AVMutableComposition()

    // Source video track and a matching mutable track in the composition.
    let videoAssetSourceTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo).first! as AVAssetTrack
    let videoCompositionTrack = comp.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)

    do {
        // Copy the first 9 seconds of the source track into the composition.
        try videoCompositionTrack.insertTimeRange(
            CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(9, 600)),
            of: videoAssetSourceTrack,
            at: kCMTimeZero)

        let videoScaleFactor = Int64(3.0)
        let videoDuration: CMTime = videoAsset.duration

        let tstStartTime = CMTime(value: 2, timescale: videoDuration.timescale)
        let tstDuration = CMTime(value: 1, timescale: videoDuration.timescale)

        // 1. Applies slow motion correctly (to the entire video).
        videoCompositionTrack.scaleTimeRange(CMTimeRangeMake(kCMTimeZero, videoDuration), toDuration: CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale))

        // 2. Replacing 1 with this: the exported video plays as is, with no slow motion effect.
        // videoCompositionTrack.scaleTimeRange(CMTimeRangeMake(kCMTimeZero, tstDuration), toDuration: CMTimeMake(tstDuration.value * videoScaleFactor, videoDuration.timescale))

        // 3. Replacing 1 with this: unexpected behaviour, the video only displays the first frame for the CMTime's value, then proceeds to play normally.
        // videoCompositionTrack.scaleTimeRange(CMTimeRangeMake(tstStartTime, tstDuration), toDuration: CMTimeMake(tstDuration.value * videoScaleFactor, videoDuration.timescale))

        videoCompositionTrack.preferredTransform = videoAssetSourceTrack.preferredTransform

    } catch { print(error) }

    asset = comp
}

Upvotes: 3

Views: 1547

Answers (1)

Dave Weston

Reputation: 6635

My guess is that it's working "correctly", but the portion of the video that you are slowing down is much, much smaller than you expect.

CMTime is a very unusual data structure, so it can be confusing to wrap your head around. What is the value of videoDuration.timescale that you are using to construct the tstStartTime and tstDuration variables? The larger that timescale value is, the smaller the slice of time represented by a given CMTime value.

For example, if the timescale is 4, then CMTime(value: 2, timescale: 4) represents 2/4 seconds, or half a second.
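To see this in code, here is a quick sketch (the values are just illustrative) that prints the seconds represented by the same value at two different timescales:

import CoreMedia

// seconds = value / timescale
let halfSecond = CMTime(value: 2, timescale: 4)    // 2/4 = 0.5 s
let tinySlice  = CMTime(value: 2, timescale: 600)  // 2/600 ≈ 0.0033 s

print(CMTimeGetSeconds(halfSecond))  // 0.5
print(CMTimeGetSeconds(tinySlice))   // 0.00333...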

For more information, see the documentation for CMTime: https://developer.apple.com/reference/coremedia/1669288-cmtime
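If that is what is happening here, one fix is to build the times from seconds so the timescale no longer matters. A minimal sketch, reusing the videoDuration, videoScaleFactor and videoCompositionTrack names from your question (this assumes the intended range is 0:00:02 for 3 seconds, and it is not tested against your project):

// Construct the times from seconds so they mean 2 s and 3 s at any timescale.
let timescale = videoDuration.timescale
let startTime = CMTimeMakeWithSeconds(2.0, timescale)  // 0:00:02
let duration  = CMTimeMakeWithSeconds(3.0, timescale)  // 3 s, so the range ends at 0:00:05

let rangeToSlow    = CMTimeRangeMake(startTime, duration)
let slowedDuration = CMTimeMultiply(duration, Int32(videoScaleFactor))  // 3x slower

videoCompositionTrack.scaleTimeRange(rangeToSlow, toDuration: slowedDuration)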

Upvotes: 1
