adit

Reputation: 33674

How do I desaturate a UIImage?

Is there an easy way (or a built-in library) in iOS 5.x to desaturate a UIImage? Here's how I'm currently doing it:

CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSaveGState(context);

CGContextTranslateCTM(context, 0.0, self.bounds.size.height); // flip image right side up
CGContextScaleCTM(context, 1.0, -1.0);

CGContextDrawImage(context, rect, self.image.CGImage);
CGContextSetBlendMode(context, kCGBlendModeSaturation);
CGContextClipToMask(context, self.bounds, self.image.CGImage); // restrict drawing to the image's alpha channel
CGContextSetRGBFillColor(context, 0.0, 0.0, 0.0, desaturation); // black at `desaturation` alpha pulls saturation toward zero
CGContextFillRect(context, rect);

CGContextRestoreGState(context); // restore state to reset blend mode

This seems more complicated than I expected. Is there a simpler way?

I was thinking of Core Image, but I can't seem to find a concrete example of how to desaturate an image with it.
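
The closest I've pieced together is something like the following, using the CIColorControls filter (untested on my end; I'm assuming that setting inputSaturation to 0 fully desaturates):

#import <CoreImage/CoreImage.h>

CIImage *ciInput = [CIImage imageWithCGImage:self.image.CGImage];
CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
[filter setValue:ciInput forKey:kCIInputImageKey];
[filter setValue:[NSNumber numberWithFloat:0.0f] forKey:@"inputSaturation"]; // 0 = grayscale
CIImage *ciOutput = [filter outputImage];

CIContext *ciContext = [CIContext contextWithOptions:nil];
CGImageRef cgOutput = [ciContext createCGImage:ciOutput fromRect:[ciOutput extent]];
UIImage *result = [UIImage imageWithCGImage:cgOutput];
CGImageRelease(cgOutput);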

Upvotes: 10

Views: 4285

Answers (1)

Brad Larson

Reputation: 170317

You can do this with my open source GPUImage framework using two or three lines of code:

UIImage *inputImage = [UIImage imageNamed:@"inputimage.png"];
GPUImageGrayscaleFilter *grayscaleFilter = [[GPUImageGrayscaleFilter alloc] init];
UIImage *grayscaleImage = [grayscaleFilter imageByFilteringImage:inputImage];

(remember to release the filter if you're not building with ARC)

This reduces the image to its luminance values, desaturating it. If you want variable saturation / desaturation, you can use a GPUImageSaturationFilter instead. As the framework name indicates, this filtering runs on the GPU, and as of iOS 5.1 it has been faster than Core Image in almost every situation I've benchmarked.
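
For partial desaturation, a sketch along these lines should work (assuming the filter's saturation property, where 0.0 is fully grayscale and 1.0 leaves the image unchanged):

GPUImageSaturationFilter *saturationFilter = [[GPUImageSaturationFilter alloc] init];
saturationFilter.saturation = 0.5; // halfway between full color and grayscale
UIImage *desaturatedImage = [saturationFilter imageByFilteringImage:inputImage];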

Upvotes: 21
