Riley Testut

Reputation: 461

Core Image Auto Adjustments Rendering MUCH Too Slowly on CPU

Understandably, rendering on the CPU takes much longer than rendering on the GPU. However, photos taken with the iPhone 4's camera exceed the GPU's maximum texture size, so they must be rendered on the CPU. This works acceptably for ordinary Core Image filters, but not for the filters returned from autoAdjustmentFiltersWithOptions:. Rendering a CIImage modified with these filters takes 40+ seconds, as opposed to a split second on the GPU.
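For reference, this is roughly how the CPU fallback can be set up with the kCIContextUseSoftwareRenderer option. It is only a sketch: the helper name and the hard-coded texture limit are illustrative, not taken from my actual project.

    #import <CoreImage/CoreImage.h>
    #import <UIKit/UIKit.h>

    // Create a Core Image context, falling back to the software (CPU) renderer
    // when the image exceeds the device's maximum GPU texture dimension.
    // The limit is passed in hard-coded (2048 for iPhone 4, 4096 for iPhone 4S)
    // purely for illustration; it could also be queried from OpenGL ES.
    static CIContext *ContextForImage(CIImage *image, CGFloat maxGPUTextureSize)
    {
        CGRect extent = image.extent;
        BOOL tooLargeForGPU = extent.size.width  > maxGPUTextureSize ||
                              extent.size.height > maxGPUTextureSize;

        NSDictionary *options = @{ kCIContextUseSoftwareRenderer : @(tooLargeForGPU) };
        return [CIContext contextWithOptions:options];
    }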

Steps to Reproduce:

  1. Create a CIImage with an image larger than 2048x2048 on an iPhone 4, or 4096x4096 on iPhone 4S.
  2. Call the method autoAdjustmentFiltersWithOptions: on the CIImage.
  3. Apply the returned filters to the CIImage.
  4. Render the CIImage to a CGImageRef.
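A condensed sketch of those four steps (the helper name, the nil options, and the always-on software renderer are assumptions for illustration; error handling is omitted):

    #import <CoreImage/CoreImage.h>
    #import <UIKit/UIKit.h>

    // Renders the auto-enhanced version of `photo` with a CPU-only context.
    // `photo` is assumed to be a full-resolution iPhone 4/4S camera image.
    static CGImageRef CopyAutoEnhancedImage(UIImage *photo)
    {
        // 1. Create a CIImage from the photo.
        CIImage *ciImage = [CIImage imageWithCGImage:photo.CGImage];

        // 2. Ask Core Image for the auto-adjustment filters.
        NSArray *filters = [ciImage autoAdjustmentFiltersWithOptions:nil];

        // 3. Apply each returned filter, feeding its output back in as the next input.
        for (CIFilter *filter in filters) {
            [filter setValue:ciImage forKey:kCIInputImageKey];
            ciImage = filter.outputImage;
        }

        // 4. Render to a CGImageRef with the software (CPU) renderer -- the step
        //    that takes 40+ seconds for images above the GPU texture limit.
        CIContext *context = [CIContext contextWithOptions:
                                 @{ kCIContextUseSoftwareRenderer : @YES }];
        return [context createCGImage:ciImage fromRect:ciImage.extent]; // caller releases
    }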

Expected Results: Rendering takes a few seconds longer than it would on the GPU.

Actual Results: It takes upwards of 40 seconds to render.

Notes: The Photos app can enhance large photos much faster than this, which shows that the iPhone 4/4S hardware is capable of it, regardless of whether the Photos app uses private APIs.

Anyone have any advice?

Upvotes: 1

Views: 771

Answers (1)

akaru

Reputation: 6317

autoAdjustmentFiltersWithOptions: uses the CPU to analyze the image and determine which filters to apply. Try downscaling the image before calling it, then apply the returned filters to the original, full-resolution image. Also consider turning off red-eye detection if you don't need it.
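Something along these lines (sketch only: the CILanczosScaleTransform downscale, the 0.25 scale factor, and the helper name are my assumptions, not a tested implementation):

    #import <CoreImage/CoreImage.h>
    #import <UIKit/UIKit.h>

    // Analyze a small copy of the image, then apply the resulting filters to the
    // full-resolution original. Red-eye detection is disabled via the options dict.
    static CIImage *AutoEnhancedImage(UIImage *photo)
    {
        CIImage *fullImage = [CIImage imageWithCGImage:photo.CGImage];

        // Downscale for analysis only (0.25 chosen arbitrarily for illustration).
        CIFilter *scale = [CIFilter filterWithName:@"CILanczosScaleTransform"];
        [scale setValue:fullImage forKey:kCIInputImageKey];
        [scale setValue:@0.25 forKey:kCIInputScaleKey];
        [scale setValue:@1.0 forKey:kCIInputAspectRatioKey];
        CIImage *smallImage = scale.outputImage;

        // Run the (slow, CPU-bound) analysis on the small copy, skipping red-eye.
        NSArray *filters = [smallImage autoAdjustmentFiltersWithOptions:
                               @{ kCIImageAutoAdjustRedEye : @NO }];

        // Apply the returned filters to the full-resolution image.
        CIImage *enhanced = fullImage;
        for (CIFilter *filter in filters) {
            [filter setValue:enhanced forKey:kCIInputImageKey];
            enhanced = filter.outputImage;
        }
        return enhanced;
    }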

Upvotes: 1
