Reputation: 47328
I'm looking for a way to create long time-lapse videos on an iPhone running iOS 9, and I'm hoping to get some pointers on how to start. Ideally I would compress 1 hour of footage into 1 minute, so the scaling factor is 60. Do I just take one frame out of every 60 and stitch them together?
I have a project which uses AVFoundation to capture images via captureOutput:didOutputSampleBuffer:fromConnection:
However, I'm not sure if there are better approaches to creating a time lapse over several hours.
Would it make sense to take individual photos and stitch them together (activating the camera every few seconds)?
Or should I just pull frames out of the CMSampleBufferRef?
Are there other APIs I can use for capturing camera images?
I'm hoping to understand which approach gives the best quality and battery life.
I'm looking at this question, which appears to have code for stitching images, but I'm not sure whether I need anything else for my project.
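For reference, the frame-decimation idea (keep one frame out of every N delivered to the capture delegate) can be sketched as a small helper; FrameDecimator is just an illustrative name, not an Apple API:

```swift
// Keep 1 of every N frames delivered by
// captureOutput:didOutputSampleBuffer:fromConnection:.
struct FrameDecimator {
    let keepEvery: Int   // e.g. 60 for a 60x time lapse at matching fps
    var count = 0

    // Call once per delivered sample buffer; returns true when the
    // frame should be appended to the time-lapse output.
    mutating func shouldKeep() -> Bool {
        defer { count += 1 }
        return count % keepEvery == 0
    }
}

var decimator = FrameDecimator(keepEvery: 60)
// 3600 delivered frames, keeping 1 in 60 -> 60 kept frames.
let kept = (0..<3600).filter { _ in decimator.shouldKeep() }.count
print(kept)  // 60
```

The delegate callback itself would simply early-return whenever shouldKeep() is false and hand the surviving buffers to the writer.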
Upvotes: 3
Views: 1705
Reputation: 4325
If you consider how a DSLR captures a time lapse, you will get the hang of it.
The camera basically takes one picture every n seconds.
Let's say you set the interval to 60. This results in one shot every minute. Leave the camera for 8 hours -> 480 minutes -> 480 pictures.
Now it's time to stitch these frames together. Let's say you assemble them at 10 fps, meaning 10 pictures per second of video. This results in 48 seconds of total footage. I wrote a short piece on this; if needed I can provide the link.
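The arithmetic above generalizes; a quick sketch (the function name is just for illustration):

```swift
// Capture one still every `interval` seconds for `totalSeconds`,
// then play the stills back at `playbackFPS`.
func timelapse(totalSeconds: Double, interval: Double, playbackFPS: Double)
    -> (pictures: Int, outputSeconds: Double) {
    let pictures = Int(totalSeconds / interval)
    return (pictures, Double(pictures) / playbackFPS)
}

// 8 hours, one shot per minute, assembled at 10 fps.
let r = timelapse(totalSeconds: 8 * 3600, interval: 60, playbackFPS: 10)
print(r.pictures, r.outputSeconds)  // 480 48.0
```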
Upvotes: 0
Reputation: 517
One way to accomplish a time lapse would be, instead of using AVCaptureVideoDataOutput to process every video frame, to use a still-image output (AVCaptureStillImageOutput on iOS 9; its replacement, AVCapturePhotoOutput, requires iOS 10) to get individual images.
A timer is then set to capture a still every second or so, and finally you stitch the frames together with AVAssetWriter to produce the video.
Check out Apple's StopNGo sample app.
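The stitching step can be sketched as below. This is a minimal, assumption-laden sketch: 640x480 frames, a temporary .mov path, and blank pixel buffers standing in for the stills a timer-driven capture would deliver. (On iOS 9 the codec key value would be the older AVVideoCodecH264 constant.)

```swift
import AVFoundation
import Foundation

// Write a short movie by appending pixel buffers to an AVAssetWriter.
let url = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("timelapse.mov")
try? FileManager.default.removeItem(at: url)

let writer = try! AVAssetWriter(outputURL: url, fileType: .mov)
let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
    AVVideoCodecKey: AVVideoCodecType.h264,  // AVVideoCodecH264 on iOS 9
    AVVideoWidthKey: 640,
    AVVideoHeightKey: 480,
])
let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                   sourcePixelBufferAttributes: nil)
writer.add(input)
writer.startWriting()
writer.startSession(atSourceTime: .zero)

for i in 0..<10 {  // ten blank frames standing in for captured stills
    var pb: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, 640, 480,
                        kCVPixelFormatType_32BGRA, nil, &pb)
    guard let buffer = pb else { continue }
    CVPixelBufferLockBaseAddress(buffer, [])
    memset(CVPixelBufferGetBaseAddress(buffer), 0, CVPixelBufferGetDataSize(buffer))
    CVPixelBufferUnlockBaseAddress(buffer, [])
    while !input.isReadyForMoreMediaData { usleep(10_000) }  // wait for the writer
    // Play back at 10 fps: frame i appears at i/10 seconds.
    adaptor.append(buffer, withPresentationTime: CMTime(value: CMTimeValue(i), timescale: 10))
}
input.markAsFinished()
let done = DispatchSemaphore(value: 0)
writer.finishWriting { done.signal() }
done.wait()
print(writer.status == .completed ? "completed" : "failed")
```

In a real app, the loop body would be replaced by the timer's capture callback handing each new still's pixel buffer to the adaptor with an increasing presentation time.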
Upvotes: 1