Reputation: 22549
I want to allow the user to take a picture and then show the grayscale version. However, it is very slow because the image file is too big/the resolution is too high.
How can I reduce the quality of the image when the user takes the picture?
Here's the code I am using for the transformation:
- (UIImage *)convertImageToGrayScale:(UIImage *)image
{
    // Create image rectangle with current image width/height
    CGRect imageRect = CGRectMake(0, 0, image.size.width, image.size.height);
    // Grayscale color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
    // Create a bitmap context with the current image size and grayscale colorspace
    CGContextRef context = CGBitmapContextCreate(nil, image.size.width, image.size.height, 8, 0, colorSpace, kCGImageAlphaNone);
    // Draw the image into the grayscale context, with the specified rectangle
    CGContextDrawImage(context, imageRect, [image CGImage]);
    /* changes start here */
    // Create a bitmap image from the pixel data in the current context
    CGImageRef grayImage = CGBitmapContextCreateImage(context);
    // Release the colorspace and graphics context
    CGColorSpaceRelease(colorSpace);
    CGContextRelease(context);
    // Make a new alpha-only graphics context
    context = CGBitmapContextCreate(nil, image.size.width, image.size.height, 8, 0, nil, kCGImageAlphaOnly);
    // Draw the image into the alpha-only context (no colorspace)
    CGContextDrawImage(context, imageRect, [image CGImage]);
    // Create an alpha bitmap mask from the current context
    CGImageRef mask = CGBitmapContextCreateImage(context);
    // Release the graphics context
    CGContextRelease(context);
    // Combine the grayscale image with the alpha mask
    CGImageRef maskedGrayImage = CGImageCreateWithMask(grayImage, mask);
    // Make a UIImage that preserves the original scale and orientation
    UIImage *grayScaleImage = [UIImage imageWithCGImage:maskedGrayImage scale:image.scale orientation:image.imageOrientation];
    // Release the CG images (the masked image would otherwise leak)
    CGImageRelease(maskedGrayImage);
    CGImageRelease(grayImage);
    CGImageRelease(mask);
    // Return the new grayscale image
    return grayScaleImage;
    /* changes end here */
}
Upvotes: 3
Views: 1885
Reputation: 14314
How about downsampling the UIImage before passing it on to the grayscale conversion? Something like:
NSData *imageAsData = UIImageJPEGRepresentation(imageFromCamera, 0.5);
UIImage *downsampledImage = [UIImage imageWithData:imageAsData];
You can of course use compression qualities other than 0.5.
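Note that lowering the JPEG quality mainly shrinks the encoded file; once decoded, the bitmap still has the same pixel dimensions, which is what the grayscale pass iterates over. If the bottleneck is pixel count, here is a minimal sketch of true downscaling (assuming the same imageFromCamera variable and a hypothetical 0.5 scale factor):
CGSize targetSize = CGSizeMake(imageFromCamera.size.width * 0.5,
                               imageFromCamera.size.height * 0.5);
// opaque context at 1x scale, since a camera photo has no alpha
UIGraphicsBeginImageContextWithOptions(targetSize, YES, 1.0);
[imageFromCamera drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *downscaledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
You could then feed downscaledImage to convertImageToGrayScale: so it processes a quarter of the pixels.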
Upvotes: 2
Reputation: 5589
If you are using AVFoundation to capture the image, you can set the quality of the captured image by changing the capture session preset, like the following:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetLow;
There is a table of which presets correspond to which resolutions in the AVFoundation Programming Guide.
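Not every device supports every preset, so it can be worth guarding the assignment; a minimal sketch reusing the session above:
// fall back to the session's default preset if the device can't do this one
if ([session canSetSessionPreset:AVCaptureSessionPresetLow]) {
    session.sessionPreset = AVCaptureSessionPresetLow;
}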
Upvotes: 2