Jad

Reputation: 188

Storing UIImages, then converting to a movie

So I've been working on a video capture project that allows users to capture images and videos and apply filters. I'm using the AVFoundation framework. I've succeeded in capturing still images and in capturing video frames as UIImage objects; the only thing left is to record a video.

Here's my code:

- (void)initCapture {
    // Create the capture session and pick a preset.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Attach the default video camera as input.
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately; adding a nil input would throw.
        NSLog(@"ERROR: trying to open camera: %@", error);
        return;
    }
    [session addInput:input];

    // Still-image output for photo capture.
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];

    // Video-data output delivers each frame to the delegate callback.
    captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;

    // Deliver frames on a dedicated serial queue.
    dispatch_queue_t queue = dispatch_queue_create("cameraQueue", NULL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Ask for BGRA pixel buffers so they map directly onto a CGBitmapContext.
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    [session addOutput:captureOutput];

    [session startRunning];
}




- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Wrap the BGRA pixel buffer in a CGBitmapContext and copy it out as a CGImage.
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);

    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);

    // The CGImage owns its own copy of the pixels now, so the buffer can be unlocked.
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    UIImage *image = [UIImage imageWithCGImage:newImage scale:1.0 orientation:UIImageOrientationRight];

    CGImageRelease(newImage);

    UIImage *ima = [filter applyFilter:image];

    /*if (isRecording == YES)
    {
        [imageArray addObject:ima];
    }
    NSLog(@"Count= %d", imageArray.count);*/

    // UIKit must be touched on the main thread.
    [self.imageView performSelectorOnMainThread:@selector(setImage:) withObject:ima waitUntilDone:YES];

    [pool drain];
}

I tried storing the UIImages in a mutable array, but that was a bad idea: the frames eat memory far too quickly. Any thoughts? Any help would be appreciated.

Upvotes: 1

Views: 415

Answers (1)

SushiGrass Jacob

Reputation: 19814

Are you using CIFilter? If not, you may want to look at it for speedy, GPU-based filtering.
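
If you're not, here's a minimal sketch of running a frame through Core Image. CISepiaTone is just a stand-in for whatever your filter does, and imageBuffer is the CVPixelBufferRef from your delegate callback:

    // Create the context once and reuse it (e.g. an ivar).
    CIContext *ciContext = [CIContext contextWithOptions:nil];

    // Per frame: wrap the pixel buffer, filter it, render it out.
    CIImage *inputImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
    CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
    [sepia setValue:inputImage forKey:kCIInputImageKey];
    [sepia setValue:[NSNumber numberWithFloat:0.8f] forKey:kCIInputIntensityKey];
    CIImage *outputImage = [sepia outputImage];

    CGImageRef cgImage = [ciContext createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *filtered = [UIImage imageWithCGImage:cgImage scale:1.0 orientation:UIImageOrientationRight];
    CGImageRelease(cgImage);

Reusing the CIContext matters; creating a new one per frame will kill your frame rate.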

You might want to record each frame to an AVAssetWriter directly after generating it. Look at the RosyWriter sample code provided by Apple for a direction as to how they do it. In summary, they use an AVAssetWriter to capture the frames to a temporary file and then, when finished, save that file to the camera roll.
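
Here's a rough sketch of that pipeline, not the RosyWriter code itself. The 480x360 dimensions (match them to your preset), the isFirstFrame flag, and the variable names are my own assumptions, and error handling is omitted:

    // One-time setup, e.g. when the user taps record.
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"capture.mov"];
    NSError *error = nil;
    AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                           fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];

    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              AVVideoCodecH264, AVVideoCodecKey,
                              [NSNumber numberWithInt:480], AVVideoWidthKey,
                              [NSNumber numberWithInt:360], AVVideoHeightKey, nil];
    AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                         outputSettings:settings];
    writerInput.expectsMediaDataInRealTime = YES;

    AVAssetWriterInputPixelBufferAdaptor *adaptor =
        [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                         sourcePixelBufferAttributes:nil];
    [assetWriter addInput:writerInput];
    [assetWriter startWriting];

    // Per frame, inside captureOutput:didOutputSampleBuffer:fromConnection:.
    CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (isFirstFrame) { // your own flag; the session must start at the first frame's time
        [assetWriter startSessionAtSourceTime:timestamp];
    }
    if (writerInput.readyForMoreMediaData) {
        [adaptor appendPixelBuffer:imageBuffer withPresentationTime:timestamp];
    }

    // When recording stops.
    [writerInput markAsFinished];
    [assetWriter finishWriting];

If you want the recorded movie to contain the filtered frames rather than the raw camera output, render your filter's result into a fresh pixel buffer (the adaptor's pixelBufferPool is handy for that) and append that instead of imageBuffer.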

One warning, though: RosyWriter was getting 4 fps on my 4th-gen iPod touch, because it does brute-force pixel manipulation on the CPU. With Core Image filtering on the GPU I was able to reach 12 fps, which, in my opinion, is still not what it should be.

Good luck!

Upvotes: 1
