Clip

Reputation: 3078

Simulate long exposure iOS

I have found several apps in the App Store that do a good job of replicating long exposure, but I have not been able to find anything in the documentation that caught my eye. One thing I have noticed is that opening these apps stops music from playing, which makes me think they are recording a video, extracting its frames, and somehow combining and overlaying those images.

Does anyone know how I can do this using AVFoundation?

Upvotes: 2

Views: 827

Answers (1)

omygaudio

Reputation: 631

Yup. I believe the frames are extracted from the video, processed, then superimposed on top of each other. Typically the processing involves scaling each frame's brightness down by a factor of the total number of frames, then accumulating the pixel values.

- (UIImage *)createLongExposure:(NSArray *)images {
   if (images.count == 0) return nil;

   UIImage *firstImg = images[0];
   CGSize imgSize = firstImg.size;
   // Each frame contributes 1/N of its brightness to the final image.
   CGFloat alpha = 1.0 / images.count;

   UIGraphicsBeginImageContext(imgSize);
   CGContextRef context = UIGraphicsGetCurrentContext();
   // Start from black so the additive blend accumulates from zero.
   CGContextSetFillColorWithColor(context, [[UIColor blackColor] CGColor]);
   CGContextFillRect(context, CGRectMake(0, 0, imgSize.width, imgSize.height));

   for (UIImage *image in images) {
      // Additive blend: pixel values are summed, clamping at white.
      [image drawInRect:CGRectMake(0, 0, imgSize.width, imgSize.height)
              blendMode:kCGBlendModePlusLighter
                  alpha:alpha];
   }
   UIImage *longExpImg = UIGraphicsGetImageFromCurrentImageContext();
   UIGraphicsEndImageContext();
   return longExpImg;
}

Sample code for capturing video frames: https://developer.apple.com/library/ios/qa/qa1702/_index.html
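If you already have a recorded video file rather than a live capture session, one way to get the frames is AVFoundation's `AVAssetImageGenerator`. A minimal sketch, assuming a local file URL and a caller-chosen `frameCount` (both are illustrative, not from the linked sample):

    #import <AVFoundation/AVFoundation.h>
    #import <UIKit/UIKit.h>

    - (NSArray *)extractFrames:(NSURL *)videoURL count:(NSUInteger)frameCount {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
        AVAssetImageGenerator *generator =
            [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
        // Respect the video track's orientation.
        generator.appliesPreferredTrackTransform = YES;
        // Ask for exact frame times instead of the nearest keyframe.
        generator.requestedTimeToleranceBefore = kCMTimeZero;
        generator.requestedTimeToleranceAfter  = kCMTimeZero;

        Float64 duration = CMTimeGetSeconds(asset.duration);
        NSMutableArray *frames = [NSMutableArray arrayWithCapacity:frameCount];
        for (NSUInteger i = 0; i < frameCount; i++) {
            // Sample frames evenly across the clip's duration.
            CMTime time = CMTimeMakeWithSeconds(duration * i / frameCount, 600);
            CGImageRef cgImage = [generator copyCGImageAtTime:time
                                                   actualTime:NULL
                                                        error:NULL];
            if (cgImage) {
                [frames addObject:[UIImage imageWithCGImage:cgImage]];
                CGImageRelease(cgImage);
            }
        }
        return frames;
    }

The resulting array can be passed straight to `createLongExposure:`. Note that `copyCGImageAtTime:actualTime:error:` is synchronous and slow for many frames; for live capture, the `AVCaptureVideoDataOutput` approach in the linked Q&A is the better fit.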

Upvotes: 5

Related Questions