Achim

Reputation: 13

Realtime access to iPhone's camera images

I'm trying to read the (average) RGB value of the center pixel(s) of the iPhone camera image. This should happen almost in realtime. To that end I open a UIImagePickerController and use a timer to take a picture every x seconds. The picture is processed in a separate thread so that computing the RGB value does not block the app. I have tried several ways to access the RGB/pixel values of the captured image, but they all have the problem that they are too slow and cause the camera view to lag. I tried the following:

- (UIColor *)getAverageColorOfImage:(UIImage *)image {
    int pixelCount = kDetectorSize * kDetectorSize;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * kDetectorSize;
    NSUInteger bitsPerComponent = 8;
    unsigned char *rawData = malloc(pixelCount * bytesPerPixel);

    // Render the image into a small RGBA8888 bitmap that can be read back directly
    CGContextRef context = CGBitmapContextCreate(rawData, kDetectorSize, kDetectorSize, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextSetInterpolationQuality(context, kCGInterpolationNone);

    NSLog(@"Drawing image");
    CGContextDrawImage(context, CGRectMake(0, 0, kDetectorSize, kDetectorSize), [image CGImage]);
    NSLog(@"Image drawn");

    CGContextRelease(context);

    // rawData now contains the image data in the RGBA8888 pixel format; alpha values are ignored
    int byteIndex = 0;
    CGFloat red = 0.0;
    CGFloat green = 0.0;
    CGFloat blue = 0.0;

    for (int i = 0; i < pixelCount; ++i) {
        red   += rawData[byteIndex];
        green += rawData[byteIndex + 1];
        blue  += rawData[byteIndex + 2];
        byteIndex += bytesPerPixel;
    }

    free(rawData);

    return [UIColor colorWithRed:red/pixelCount/255.0 green:green/pixelCount/255.0 blue:blue/pixelCount/255.0 alpha:1.0];
}

kDetectorSize is set to 6, so the processed image has a size of 6x6 pixels. The image passed in as the image parameter has also been cropped to 6x6 pixels beforehand. The slow part is CGContextDrawImage, which takes about 500-600 ms on my iPhone 4. I tried some alternatives for that line:

UIGraphicsPushContext(context); 
[image drawAtPoint:CGPointMake(0.0, 0.0)];
UIGraphicsPopContext();

or

UIGraphicsPushContext(context); 
[image drawInRect:CGRectMake(0.0, 0.0, kDetectorSize, kDetectorSize)];
UIGraphicsPopContext();

Both approaches are as slow as the one above, and the image size has no significant influence (I'd say it has none at all). Does anyone know a faster way to access the RGB values?
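One direction I have not fully explored is reading the CGImage's backing bytes directly instead of redrawing the image. A minimal sketch, assuming the cropped image's bitmap really is tightly packed 8-bit RGBA (this would have to be verified via CGImageGetBitmapInfo first, since that is exactly what drawing into a known context otherwise guarantees):

// Sketch only: read the image's existing bytes, skipping CGContextDrawImage.
// Assumes 8-bit RGBA data; check CGImageGetBitmapInfo() before relying on this.
CGImageRef cgImage = [croppedImage CGImage];
CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));
const UInt8 *bytes = CFDataGetBytePtr(pixelData);
size_t bytesPerPixel = CGImageGetBitsPerPixel(cgImage) / 8;
size_t bytesPerRow = CGImageGetBytesPerRow(cgImage); // may include row padding

CGFloat red = 0.0, green = 0.0, blue = 0.0;
for (size_t y = 0; y < kDetectorSize; ++y) {
    for (size_t x = 0; x < kDetectorSize; ++x) {
        const UInt8 *p = bytes + y * bytesPerRow + x * bytesPerPixel;
        red   += p[0];
        green += p[1];
        blue  += p[2];
    }
}
CFRelease(pixelData);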

It would also be acceptable if the thread simply did not cause the camera view to lag. I spawn my thread like this:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    [NSThread detachNewThreadSelector:@selector(pickColorFromImage:) toTarget:self withObject:image];
}

- (void)pickColorFromImage:(UIImage *)image {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    [NSThread setThreadPriority:0.0];

    [...cropping the image...]
    UIColor *averageColor = [self getAverageColorOfImage:croppedImage];

    [self performSelectorOnMainThread:@selector(applyPickedColor:) withObject:averageColor waitUntilDone:NO];

    [pool release];
}
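A variant using Grand Central Dispatch instead of detaching a fresh NSThread for every picture might reduce the per-frame overhead. This is only a sketch of that idea, not code I have benchmarked:

// Sketch: hand the work to a low-priority GCD queue instead of
// creating a new thread for every captured picture.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
        // [...cropping the image...] as before, producing croppedImage
        UIColor *averageColor = [self getAverageColorOfImage:croppedImage];
        dispatch_async(dispatch_get_main_queue(), ^{
            [self applyPickedColor:averageColor];
        });
    });
}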

Thanks for your help!

Upvotes: 1

Views: 1317

Answers (1)

lxt

Reputation: 31304

You're approaching this the wrong way: Apple offers a class that does exactly what you want, without messing around with timers and UIImagePickerController. AVCaptureSession and its related classes give you realtime access to the raw pixel data from the camera(s).

For more information refer to the documentation:

http://developer.apple.com/library/ios/#documentation/AVFoundation/Reference/AVCaptureSession_Class/Reference/Reference.html%23//apple_ref/doc/uid/TP40009521
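A minimal sketch of what such a session might look like (the preset choice, the queue label, and the applyPickedColor-style processing are illustrative, not part of the original answer):

// Sketch: an AVCaptureSession that delivers BGRA frames to a delegate callback.
#import <AVFoundation/AVFoundation.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetLow; // small frames suffice for averaging

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[session addInput:input];

AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                                   forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[output setSampleBufferDelegate:self queue:dispatch_queue_create("pixel-reader", NULL)];
[session addOutput:output];
[session startRunning];

Each captured frame then arrives in the standard sample buffer delegate method, where the pixel bytes can be read in place, with no UIImage or CGContextDrawImage involved:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    unsigned char *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    // Read the center pixels directly from 'base' here (BGRA byte order),
    // e.g. average them as in the question's loop.
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}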

Upvotes: 4
