Andy Hin

Reputation: 31893

AVFoundation - Reverse an AVAsset and output video file

I've seen this question asked a few times, but none of those posts seem to have a working answer.

The requirement is to reverse and output a video file (not just play it in reverse) keeping the same compression, format, and frame rate as the source video.

Ideally, the solution would do this entirely in memory or in a buffer, and avoid dumping the frames out to image files (for example, with AVAssetImageGenerator) and then recompiling them, which is resource intensive, gives unreliable timing, and can change the frame/image quality from the original.

--

My contribution: This is still not working, but the best I've tried so far:

Upvotes: 25

Views: 8407

Answers (2)

Vikas Saini

Reputation: 581

Swift 5 version of the original answer:

import AVFoundation

extension AVAsset {
    func getReversedAsset(outputURL: URL) -> AVAsset? {
        do {
            let reader = try AVAssetReader(asset: self)

            guard let videoTrack = tracks(withMediaType: AVMediaType.video).last else {
                return .none
            }

            let readerOutputSettings = [
                kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
            ]

            let readerOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: readerOutputSettings)

            reader.add(readerOutput)
            reader.startReading()

            // Read in frames (CMSampleBuffer is a frame)
            var samples = [CMSampleBuffer]()
            while let sample = readerOutput.copyNextSampleBuffer() {
                samples.append(sample)
            }

            // Write the frames back out to a file with AVAssetWriter
            let writer = try AVAssetWriter(outputURL: outputURL, fileType: AVFileType.mp4)

            let writerOutputSettings = [
                AVVideoCodecKey: AVVideoCodecType.h264,
                AVVideoWidthKey: videoTrack.naturalSize.width,
                AVVideoHeightKey: videoTrack.naturalSize.height,
                AVVideoCompressionPropertiesKey: [AVVideoAverageBitRateKey: videoTrack.estimatedDataRate]
            ] as [String : Any]
            
            // Hand the writer the source track's format description as a hint
            // about the incoming buffers.
            let sourceFormatHint = videoTrack.formatDescriptions.last as! CMFormatDescription
            let writerInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: writerOutputSettings, sourceFormatHint: sourceFormatHint)
            writerInput.expectsMediaDataInRealTime = false

            // The adaptor lets us append raw pixel buffers with explicit
            // presentation times, which is how the frame order gets reversed.
            let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput, sourcePixelBufferAttributes: .none)
            writer.add(writerInput)
            writer.startWriting()
            // Guard against an empty track before starting the timeline.
            guard let firstSample = samples.first else { return .none }
            writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(firstSample))

            for (index, sample) in samples.enumerated() {
                // Keep the forward presentation timestamps from the source...
                let presentationTime = CMSampleBufferGetPresentationTimeStamp(sample)

                // ...but take the pixel buffer from the opposite end of the
                // array, so the frames are written in reverse order.
                if let imageBufferRef = CMSampleBufferGetImageBuffer(samples[samples.count - index - 1]) {
                    // Wait until the input can accept more data before appending.
                    while !writerInput.isReadyForMoreMediaData {
                        Thread.sleep(forTimeInterval: 0.1)
                    }
                    pixelBufferAdaptor.append(imageBufferRef, withPresentationTime: presentationTime)
                }
            }

            // finishWriting(completionHandler:) is asynchronous; block until it
            // completes so the returned asset points at a fully written file.
            let finishSemaphore = DispatchSemaphore(value: 0)
            writer.finishWriting { finishSemaphore.signal() }
            finishSemaphore.wait()

            return AVAsset(url: outputURL)
        }
        catch let error as NSError {
            print("\(error)")
            return .none
        }
    }
}
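For reference, a minimal usage sketch of the extension above; the source and output URLs are placeholders, not part of the answer:

import AVFoundation

// Hypothetical paths for illustration only; substitute your own source
// video and a writable destination.
let sourceURL = URL(fileURLWithPath: "/path/to/source.mp4")
let outputURL = FileManager.default.temporaryDirectory.appendingPathComponent("reversed.mp4")

// AVAssetWriter fails if a file already exists at the output URL.
try? FileManager.default.removeItem(at: outputURL)

let sourceAsset = AVAsset(url: sourceURL)
if let reversedAsset = sourceAsset.getReversedAsset(outputURL: outputURL) {
    print("Wrote reversed video to \(outputURL)")
    _ = reversedAsset // e.g. hand off to an AVPlayerItem or further processing
}

Note that getReversedAsset(outputURL:) reads every frame into memory and blocks while writing, so it is best called off the main thread, and long or high-resolution clips can use a lot of memory.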

Upvotes: 1

Andy Hin

Reputation: 31893

Worked on this over the last few days and was able to get it working.

Source code here: http://www.andyhin.com/post/5/reverse-video-avfoundation

Uses AVAssetReader to read out the samples/frames, extracts the image/pixel buffer from each, and then appends it with the presentation time of the mirror frame (the frame at the same offset from the opposite end of the video).
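The Swift 5 answer above is a port of this approach. Isolated as a small helper, the pairing step looks roughly like the sketch below; this is my summary of the idea, not the code from the linked post, and the function name and parameters are placeholders:

import AVFoundation

// Sketch only: `samples` holds every frame read from an
// AVAssetReaderTrackOutput, in source order; the writer input and pixel
// buffer adaptor are assumed to be set up as in the Swift 5 answer above.
func appendReversed(samples: [CMSampleBuffer],
                    to adaptor: AVAssetWriterInputPixelBufferAdaptor,
                    input writerInput: AVAssetWriterInput) {
    for (index, sample) in samples.enumerated() {
        // Forward timestamp from the index-th frame...
        let time = CMSampleBufferGetPresentationTimeStamp(sample)

        // ...paired with the pixel buffer of the frame mirrored around the
        // middle of the clip, so frames are written in reverse order.
        let mirroredSample = samples[samples.count - 1 - index]
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(mirroredSample) else { continue }

        // Only append when the writer input is ready for more data.
        while !writerInput.isReadyForMoreMediaData {
            Thread.sleep(forTimeInterval: 0.05)
        }
        if !adaptor.append(pixelBuffer, withPresentationTime: time) {
            // Append failed (e.g. the writer errored out); stop early.
            break
        }
    }
}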

Upvotes: 17
