Reputation: 5448
I am working on a utility for cropping UIImages according to an arbitrary aspect ratio, with either aspectFit or aspectFill as a crop option. The aspectFill option will crop the original image in such a way that the final frame is fully covered by the original image. aspectFit will make sure that no pixel of the original image is cut, and black stripes will be added to the sides of the original image to make it fit the aspect ratio. I know that there already are 3rd-party libraries which do the same job, but I wanted to make this myself as a learning exercise.
For aspectFill, I am simply calculating the final image offsets and size and cropping that CGRect from the original UIImage.

For aspectFit, I calculate the final image size and then make a CGContextRef, which is filled black using CGContextFillRect. To that context, I draw the original image at the required offsets (keeping the original image in the middle of the final image).

To test this utility, I am using a 2MB image, which is approximately the same size as an iPhone camera photo. The following problems cropped up:
The utility for aspectFill is working as expected, with each image taking around 0.01ms of processing time, which is great. The problem is that if I try to run this utility in a loop for a lot of images (10000+), the memory usage spikes until the app crashes. I added an @autoreleasepool block, but it seems to be making no difference.
The aspectFit utility has the opposite problem. The @autoreleasepool block here works as expected and releases the objects periodically, so the app does not crash for any number of images in a loop. But here, each image takes around 130ms of processing time, which seems to be a lot. I tried using UIGraphicsBeginImageContext instead of CGContextRef, but that was taking even more time.
Code for aspectFill:

//ox, oy are the image crop offsets calculated before; fw, fh are the width and height of the final image
@autoreleasepool {
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], CGRectMake(ox, oy, fw, fh));
    UIImage *finalImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return finalImage;
}
Code for aspectFit:

@autoreleasepool {
    CGColorSpaceRef colorSpace = CGImageGetColorSpace(image.CGImage);
    CGContextRef context = CGBitmapContextCreate(NULL, fw, fh, CGImageGetBitsPerComponent(image.CGImage), 0, colorSpace, CGImageGetBitmapInfo(image.CGImage));
    //fill the whole canvas black, then draw the source image centered on it
    CGContextSetFillColorWithColor(context, [UIColor blackColor].CGColor);
    CGContextFillRect(context, CGRectMake(0, 0, fw, fh));
    CGRect rect = CGRectMake(ox, oy, image.size.width, image.size.height);
    CGContextDrawImage(context, rect, image.CGImage);
    CGImageRef newCGImage = CGBitmapContextCreateImage(context);
    CGContextRelease(context);
    UIImage *finalImage = [UIImage imageWithCGImage:newCGImage];
    CGImageRelease(newCGImage);
    return finalImage;
}
Can someone point out what I am doing wrong, or suggest some optimisations for reducing the processing time of this utility? Thanks in advance!
Upvotes: 1
Views: 94
Finally, after a lot of messing about with Core Graphics and UIGraphics stuff, I found out that the issue was not with the utility at all. The code above was perfect; the culprit was the way I was loading the UIImage.
I was using

[UIImage imageNamed:@"twomb.jpg"];

to load the image. The imageNamed: method caches the image in memory, all 2MB of it, and that was eating up the time. On changing the above line to

[UIImage imageWithContentsOfFile:@"twomb.jpg"];

the time and memory use both reduced drastically.
Upvotes: 2