Sam Bing

Reputation: 2872

Recording video using AVFoundation Swift

I created an AVCaptureSession and manipulate each frame (morphing the user's face, adding layers, etc.). How can I turn those frames into a video that can be saved to the camera roll?

Here's how I set up the AVCaptureSession:

func setupCapture() {

    let session : AVCaptureSession = AVCaptureSession()
    session.sessionPreset = AVCaptureSessionPreset640x480

    // Use the default video capture device as input.
    let device : AVCaptureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    let deviceInput : AVCaptureDeviceInput = try! AVCaptureDeviceInput(device: device)

    if session.canAddInput(deviceInput) {
        session.addInput(deviceInput)
    }

    stillImageOutput = AVCaptureStillImageOutput()

    // The video data output delivers each frame as a CMSampleBuffer to the delegate.
    videoDataOutput = AVCaptureVideoDataOutput()

    let rgbOutputSettings = [kCVPixelBufferPixelFormatTypeKey as String: NSNumber(unsignedInt: kCMPixelFormat_32BGRA)]

    videoDataOutput.videoSettings = rgbOutputSettings
    videoDataOutput.alwaysDiscardsLateVideoFrames = true

    // Deliver frames on a dedicated serial queue.
    videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL)
    videoDataOutput.setSampleBufferDelegate(self, queue: videoDataOutputQueue)

    if session.canAddOutput(videoDataOutput) {
        session.addOutput(videoDataOutput)
    }
    videoDataOutput.connectionWithMediaType(AVMediaTypeVideo).enabled = false

    effectiveScale = 1.0

    // Show the live camera feed in the preview view.
    previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.backgroundColor = UIColor.blackColor().CGColor
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspect
    let rootLayer : CALayer = previewView.layer
    rootLayer.masksToBounds = true
    previewLayer.frame = rootLayer.bounds
    rootLayer.addSublayer(previewLayer)
    session.startRunning()
}

Then I use the CMSampleBuffer to get a CIImage, which I add my effects to.

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {

    let pixelBuffer : CVPixelBufferRef = CMSampleBufferGetImageBuffer(sampleBuffer)!
    let attachments : CFDictionaryRef = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, pixelBuffer, CMAttachmentMode(kCMAttachmentMode_ShouldPropagate))!

    let ciImage : CIImage = CIImage(CVPixelBuffer: pixelBuffer, options: attachments as? [String : AnyObject])

    // ... morph the face, add layers, etc. to ciImage here ...
}

How can I record video while doing this?

Upvotes: 2

Views: 1735

Answers (1)

Alex Echevarria

Reputation: 21

I found a solution to your problem. I too am working in Swift and had the same question. I'm thankful you posted this question because it helped me a lot. I was able to successfully process frames and then write them to the camera roll. I have not perfected it, but it is possible.

It turns out that once you start working with AVCaptureVideoDataOutput, you allegedly lose the ability to use AVCaptureMovieFileOutput; see the discussion here.

A concise solution to your problem can be found here; this is the version I implemented. Feed the sampleBuffer to an AVAssetWriterInput object.
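As a rough sketch of that step (Swift 2 syntax to match the question; assetWriter and assetWriterInput are property names I made up, created as in the setup sketch after the next paragraph), the delegate callback could feed each frame to the writer like this:

// Inside captureOutput(_:didOutputSampleBuffer:fromConnection:), after processing the frame.
let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

// Start the writer on the first frame that arrives.
if assetWriter.status == .Unknown {
    assetWriter.startWriting()
    assetWriter.startSessionAtSourceTime(timestamp)
}

// Hand the frame to the writer input whenever it can accept more data.
if assetWriter.status == .Writing && assetWriterInput.readyForMoreMediaData {
    assetWriterInput.appendSampleBuffer(sampleBuffer)
}

Note that appending the raw sampleBuffer records the unmodified camera frames. To record your morphed CIImage instead, render it back into a CVPixelBuffer (for example with CIContext's render(_:toCVPixelBuffer:)) and append that through an AVAssetWriterInputPixelBufferAdaptor using appendPixelBuffer(_:withPresentationTime:).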

A more verbose guide to solving the problem describes exactly which parts (i.e. AVAssetWriter, AVAssetWriterInput, the output settings, etc.) are needed and what they look like.
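For reference, here is roughly what that setup might look like, again in Swift 2 syntax; the property names, temporary file path, and video dimensions are my assumptions, not something from the linked guide:

// Assumed properties on the view controller:
// var assetWriter : AVAssetWriter!
// var assetWriterInput : AVAssetWriterInput!
// var pixelBufferAdaptor : AVAssetWriterInputPixelBufferAdaptor!

func setupAssetWriter() {
    // Write into a temporary movie file; it gets saved to the camera roll once finished.
    let outputURL = NSURL(fileURLWithPath: NSTemporaryDirectory() + "recording.mov")
    assetWriter = try! AVAssetWriter(URL: outputURL, fileType: AVFileTypeQuickTimeMovie)

    // Output settings matching the 640x480 session preset used in the question.
    let outputSettings : [String : AnyObject] = [
        AVVideoCodecKey  : AVVideoCodecH264,
        AVVideoWidthKey  : 640,
        AVVideoHeightKey : 480
    ]

    assetWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings)
    assetWriterInput.expectsMediaDataInRealTime = true

    // The adaptor is only needed if you append processed CVPixelBuffers
    // instead of the raw sample buffers.
    pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: assetWriterInput,
        sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: NSNumber(unsignedInt: kCMPixelFormat_32BGRA)])

    if assetWriter.canAddInput(assetWriterInput) {
        assetWriter.addInput(assetWriterInput)
    }
}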

The links I posted are in Objective-C but can be translated into Swift. I hope you have already solved your problem.
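To cover the camera roll part of the original question: once you stop appending frames, finish the writer and hand the file to the Photos framework. This is only a sketch (error handling and photo library authorization are left out):

import Photos

func finishRecording() {
    assetWriterInput.markAsFinished()
    assetWriter.finishWritingWithCompletionHandler {
        // Save the finished movie file to the camera roll.
        PHPhotoLibrary.sharedPhotoLibrary().performChanges({
            PHAssetChangeRequest.creationRequestForAssetFromVideoAtFileURL(self.assetWriter.outputURL)
        }, completionHandler: { success, error in
            print("Saved to camera roll: \(success)")
        })
    }
}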

Upvotes: 2
