Reputation: 461
Understandably, rendering on the CPU takes much longer than rendering on the GPU. However, photos taken with the iPhone 4's camera (2592×1936) exceed the GPU's maximum texture size of 2048×2048, so they must be rendered with the CPU. This works well for most Core Image filters, but not for the filters returned by autoAdjustmentFiltersWithOptions:. Rendering a CIImage with these filters applied takes 40+ seconds on the CPU, versus a split second on the GPU.
Steps to Reproduce:
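A minimal sketch of the scenario, assuming a full-resolution UIImage named `photo` straight from the camera:

```
#import <CoreImage/CoreImage.h>

// `photo` is an assumed full-resolution UIImage from the iPhone 4
// camera (2592x1936, 5 MP).
CIImage *image = [[CIImage alloc] initWithCGImage:photo.CGImage];

// Ask Core Image for the auto-enhance filter chain and apply it.
NSArray *filters = [image autoAdjustmentFiltersWithOptions:nil];
for (CIFilter *filter in filters) {
    [filter setValue:image forKey:kCIInputImageKey];
    image = filter.outputImage;
}

// Render with a CPU (software) context; this is the 40+ second step.
CIContext *context = [CIContext contextWithOptions:
    @{ kCIContextUseSoftwareRenderer : @YES }];
CGImageRef rendered = [context createCGImage:image fromRect:image.extent];
// ... use `rendered`, then release it.
CGImageRelease(rendered);
```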
Expected Results: Rendering takes only a few seconds longer than it would on the GPU.
Actual Results: It takes upwards of 40 seconds to render.
Notes: The Photos app can enhance large photos much faster than this method, which shows that the iPhone 4/4S's hardware is capable of it, regardless of whether the Photos app uses private APIs.
Anyone have any advice?
Upvotes: 1
Views: 771
Reputation: 6317
autoAdjustmentFiltersWithOptions: runs its image analysis on the CPU to decide which filters to apply. Try downscaling the image before calling it, then apply the returned filters to the full-size original. Also, consider turning off red-eye detection (the kCIImageAutoAdjustRedEye option) if you don't need it.
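A minimal sketch of that approach; `photo`, the 0.25 scale factor, and the explicit software-renderer option are assumptions for illustration:

```
#import <CoreImage/CoreImage.h>

// Assumed full-resolution UIImage from the camera.
CIImage *fullImage = [[CIImage alloc] initWithCGImage:photo.CGImage];

// Analyze a small version of the image instead of the full 5 MP original.
// The 0.25 factor is an arbitrary example.
CIImage *smallImage =
    [fullImage imageByApplyingTransform:CGAffineTransformMakeScale(0.25, 0.25)];

// Skip red-eye detection, then get the filter chain from the small image.
NSDictionary *options = @{ kCIImageAutoAdjustRedEye : @NO };
NSArray *filters = [smallImage autoAdjustmentFiltersWithOptions:options];

// Apply the returned filters to the full-resolution image.
CIImage *result = fullImage;
for (CIFilter *filter in filters) {
    [filter setValue:result forKey:kCIInputImageKey];
    result = filter.outputImage;
}

// Render on the CPU as before; only this final pass touches the big image.
CIContext *context = [CIContext contextWithOptions:
    @{ kCIContextUseSoftwareRenderer : @YES }];
CGImageRef rendered = [context createCGImage:result fromRect:result.extent];
CGImageRelease(rendered);
```

Disabling red-eye detection should also keep the downscaled analysis transferable: red-eye correction is the one adjustment whose parameters depend on image coordinates, while the remaining tone and color filters are scale-independent.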
Upvotes: 1