KyleStew

Reputation: 420

Core Image many times slower on first render

I can't find any documentation from Apple to explain why this piece of code runs at different speeds depending on how many times it's been run.

- (void)speedTest2:(CIImage*)source {
    NSTimeInterval start = CFAbsoluteTimeGetCurrent();

    CIFilter* filter = [CIFilter filterWithName:@"CIColorInvert"];
    [filter setValue:source forKey:kCIInputImageKey];

    CGImageRef cgImage = [_context createCGImage:filter.outputImage fromRect:source.extent];
    UIImage* output = [UIImage imageWithCGImage:cgImage];
    if (cgImage)
        CFRelease(cgImage);
    _source.image = output;

    NSLog(@"time: %0.3fms", 1000.0f * (CFAbsoluteTimeGetCurrent() - start));
}

Run times (timings screenshot omitted): the first run is many times slower than subsequent runs.

The same source image is being used for every run.

I know Core Image concatenates the filter chain. Is this somehow being cached? Can I pre-cache this operation so users don't get hit with performance problems on their first app launch?

This one is making me crazy :(

Upvotes: 4

Views: 1238

Answers (2)

signal

Reputation: 225

There are three ways to create a context to draw the outputImage: contextWithOptions: creates a context on the GPU or CPU, depending on your device and options; contextWithEAGLContext: and contextWithEAGLContext:options: create a context on the GPU. See the Core Image Programming Guide.
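For reference, a rough sketch of the three creation methods mentioned above (the options dictionary shown is just one illustration):

    // Core Image picks CPU or GPU based on the device and options:
    CIContext *ctx = [CIContext contextWithOptions:nil];

    // Explicitly force software (CPU) rendering via an option:
    CIContext *cpuCtx = [CIContext contextWithOptions:
        @{ kCIContextUseSoftwareRenderer : @YES }];

    // GPU-backed context tied to an OpenGL ES context:
    EAGLContext *eagl = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    CIContext *gpuCtx = [CIContext contextWithEAGLContext:eagl];
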

Upvotes: -2

isoiphone

Reputation: 312

A portion of the overhead may be the image library itself loading. If the effects are implemented as pixel shaders, there may well be a compilation step going on behind the scenes.

This hidden cost is unavoidable, but you can choose to pay it at a more convenient time — for example, while the application is loading.

I would suggest loading a small image (1x1 px) and applying some effects to it during load to see if it helps.
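A minimal warm-up along these lines might look like the following sketch (it assumes a CIContext ivar named _context, as in the question, and uses the same CIColorInvert filter; the method name is mine):

    // Call once at launch (e.g. from application:didFinishLaunchingWithOptions:)
    // to pay Core Image's one-time setup/shader-compilation cost up front.
    - (void)warmUpCoreImage {
        // A 1x1 solid-color image is enough to trigger the lazy setup work.
        CIImage *tiny = [CIImage imageWithColor:[CIColor colorWithRed:0 green:0 blue:0]];
        tiny = [tiny imageByCroppingToRect:CGRectMake(0, 0, 1, 1)];

        CIFilter *filter = [CIFilter filterWithName:@"CIColorInvert"];
        [filter setValue:tiny forKey:kCIInputImageKey];

        // Rendering forces the work to happen now rather than on first real use.
        CGImageRef cg = [_context createCGImage:filter.outputImage
                                       fromRect:CGRectMake(0, 0, 1, 1)];
        if (cg)
            CGImageRelease(cg);
    }

Whether this fully removes the first-render penalty depends on what Core Image is actually caching, so it's worth timing before and after.
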

You may also want to try the official Apple forums for a response.

Upvotes: 6
