Reputation: 10961
I've got a UICollectionView in my app whose cells mainly consist of UIImageViews containing images that have been manipulated with Core Image to have less color saturation. Performance is absolutely horrible when scrolling. When I profile it, the huge majority of time spent (80% or so) is not in my code or even in my stack. It all appears to be in Core Animation code. Might anyone know why this could be?
In my UICollectionViewCell subclass I have something like this:
UIImage *img = [self obtainImageForCell];
img = [img applySaturation:0.5];
self.imageView.image = img;
applySaturation looks like this:
CIImage *image = [CIImage imageWithCGImage:self.CGImage];
CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
[filter setValue:image forKey:kCIInputImageKey];
[filter setValue:[NSNumber numberWithFloat:saturation] forKey:@"inputSaturation"];
// Returns a UIImage backed by an unrendered CIImage "recipe".
return [UIImage imageWithCIImage:filter.outputImage];
My only guess is that Core Animation doesn't play well with Core Image. The Apple docs say this about CIImage:
Although a CIImage object has image data associated with it, it is not an image. You can think of a CIImage object as an image “recipe.” A CIImage object has all the information necessary to produce an image, but Core Image doesn’t actually render an image until it is told to do so. This “lazy evaluation” method allows Core Image to operate as efficiently as possible.
Doing this evaluation at the last minute while animating might be tricky.
Upvotes: 1
Views: 718
Reputation: 1316
I also had this exact problem – right down to wanting to desaturate an image! – and filtering once and caching the result (even as a UIImage) didn't help.
The problem, as others have mentioned, is that a CIImage encapsulates the information required to generate an image, but isn't actually an image itself. So when scrolling, the on-screen image needs to be generated on the fly, which kills performance. The same turns out to be true of a UIImage created using the imageWithCIImage:scale:orientation: method, so creating this once and reusing it also doesn't help.
The solution is to force Core Image to actually render the output into a CGImage before wrapping it in a UIImage. This gave a huge improvement in scrolling performance for me. In your case, the applySaturation method might look like this:
CIImage *inputImage = [CIImage imageWithCGImage:self.CGImage];
CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
[filter setValue:inputImage forKey:kCIInputImageKey];
[filter setValue:[NSNumber numberWithFloat:saturation] forKey:@"inputSaturation"];
// Render eagerly into a CGImage so no Core Image work is left
// to happen lazily during scrolling.
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *outputImage = filter.outputImage;
CGImageRef cgImage = [context createCGImage:outputImage fromRect:outputImage.extent];
UIImage *result = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
return result;
You might also consider caching the CIFilter and/or the CIContext if you're going to be using this method a lot, since these can be expensive to create.
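For example, a minimal sketch of caching the context (the sharedFilterContext name is just illustrative, not from the original code):
+ (CIContext *)sharedFilterContext {
    // Create the CIContext once; building one per image is expensive.
    static CIContext *context = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        context = [CIContext contextWithOptions:nil];
    });
    return context;
}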
Upvotes: 0
Reputation: 31745
I had the exact same problem, and cured it by avoiding triggering Core Image filters during cell updates.
The Apple docs stuff about lazy evaluation / recipes is, I think, directed more at the idea that you can chain Core Image filters together very efficiently. However, when you want to display the result of a Core Image filter chain, it has to be evaluated then and there, which is not a good situation if the 'then and there' is during a rapidly-scrolling view and the filter in question requires heavy processing (many of them do).
You can try fiddling with GPU vs CPU processing, but I have found that moving image data into and out of a CIImage can be a more significant cost (see my answer here).
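If you want to experiment with that, you can request the CPU path when creating your CIContext (a minimal sketch; whether this helps is workload-dependent):
// Request CPU (software) rendering rather than the GPU path.
CIContext *cpuContext =
    [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @YES}];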
My recommendation is to treat this kind of processing the same way you would treat populating a scrolling view with online images - i.e. process asynchronously, use placeholders (e.g. the image prior to filtering), and cache results for reuse.
Update, in reply to your comment:
Applicable filters are applied when you extract data from the CIImage - for example, with imageWithCIImage: [warning - this is my inference, I have not tested].
But this is not your problem... you need to process your images on a background thread, as the processing takes time that will hold up the scrolling. Meanwhile, display something else in the scrolling cell, such as a flat color or - better - the UIImage you are feeding into your CIImage for filtering. Update the cell when the processing is done (check that it still needs updating; it may have scrolled offscreen by then). Save the filtered image in some kind of persistent store so that you don't need to filter it a second time, and check the cache whenever you need to display the image again before reprocessing from scratch.
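A minimal sketch of that pattern, assuming an NSCache property and a couple of illustrative helpers (imageCache, obtainImageForIndexPath:, and the MyCell class are hypothetical; applySaturation: is the eagerly-rendering version from earlier in this thread):
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    MyCell *cell = (MyCell *)[collectionView dequeueReusableCellWithReuseIdentifier:@"Cell"
                                                                       forIndexPath:indexPath];
    NSString *key = [NSString stringWithFormat:@"%ld-%ld",
                     (long)indexPath.section, (long)indexPath.item];

    UIImage *cached = [self.imageCache objectForKey:key];
    if (cached) {
        cell.imageView.image = cached;
        return cell;
    }

    // Placeholder: show the unfiltered image while filtering runs off the main thread.
    UIImage *original = [self obtainImageForIndexPath:indexPath];
    cell.imageView.image = original;

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        UIImage *filtered = [original applySaturation:0.5];
        [self.imageCache setObject:filtered forKey:key];
        dispatch_async(dispatch_get_main_queue(), ^{
            // The cell may have been reused or scrolled offscreen by now;
            // cellForItemAtIndexPath: returns nil if the item is not visible.
            MyCell *visibleCell = (MyCell *)[collectionView cellForItemAtIndexPath:indexPath];
            visibleCell.imageView.image = filtered;
        });
    });
    return cell;
}
If the item has scrolled offscreen, the result still lands in the cache, so the next time that cell comes around it displays instantly.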
Upvotes: 2