Reputation: 4211
How can I clear out the magenta part of a UIImage and make it transparent?
I've looked through numerous answers and links on SO and nothing works (e.g. "How to make one color transparent on a UIImage?": answer 1 removes everything but red, and answer 2 apparently doesn't work because of "Why is CGImageCreateWithMaskingColors() returning nil in this case?").
Update:
If I use CGImageCreateWithMaskingColors on the UIImage directly, I get a nil value. If I remove the alpha channel first (by re-encoding the image as JPEG and reading it back), CGImageCreateWithMaskingColors returns an image painted on a black background.
Update2, the code:
Returning nil:
const CGFloat colorMasking[6] = {222, 255, 222, 255, 222, 255};
CGImageRef imageRef = CGImageCreateWithMaskingColors(anchorWithMask.CGImage, colorMasking);
NSLog(@"image ref %@", imageRef);
// this is going to return a nil imgref.
UIImage *image = [UIImage imageWithCGImage:imageRef];
Returning an image with a black background (which is expected since there is no alpha channel):
UIImage *inputImage = [UIImage imageWithData:UIImageJPEGRepresentation(anchorWithMask, 1.0)];
const CGFloat colorMasking[6] = {222, 255, 222, 255, 222, 255};
CGImageRef imageRef = CGImageCreateWithMaskingColors(inputImage.CGImage, colorMasking);
NSLog(@"image ref %@", imageRef);
// imgref is NOT nil.
UIImage *image = [UIImage imageWithCGImage:imageRef];
Update3:
I got it working by adding the alpha channel after the masking process.
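A minimal sketch of that step, assuming the masked image is redrawn into a bitmap context that has an alpha channel so the masked-out pixels stay transparent (the helper name and context settings below are assumptions, not the exact code used):

static UIImage *imageByAddingAlpha(CGImageRef maskedRef)
{
    // Draw the masked CGImage into an RGBA bitmap context; pixels that
    // were masked out are simply not drawn, so they remain transparent.
    size_t width = CGImageGetWidth(maskedRef);
    size_t height = CGImageGetHeight(maskedRef);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(NULL, width, height, 8, 0, colorSpace,
                                             (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), maskedRef);
    CGImageRef withAlpha = CGBitmapContextCreateImage(ctx);
    CGContextRelease(ctx);
    UIImage *result = [UIImage imageWithCGImage:withAlpha];
    CGImageRelease(withAlpha);
    return result;
}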
Upvotes: 10
Views: 17870
Reputation: 257
I did this with CIImage, post-processing the output from Vision, which comes in as a pixel buffer:
let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
let filteredImage = ciImage.applyingFilter("CIMaskToAlpha")
self.picture.image = UIImage(ciImage: filteredImage)
Upvotes: 0
Reputation: 16552
I made this class method that removes the white background. You can use it by replacing the mask values with the color range you want to remove:
+ (UIImage *)processImage:(UIImage *)image
{
    const CGFloat colorMasking[6] = {222, 255, 222, 255, 222, 255};
    CGImageRef imageRef = CGImageCreateWithMaskingColors(image.CGImage, colorMasking);
    UIImage *imageB = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return imageB;
}
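As the question's updates note, CGImageCreateWithMaskingColors returns NULL when the source image already carries an alpha channel, so it is worth guarding against a nil imageRef. A usage sketch (ImageUtils is a placeholder for whatever class declares this method):

// ImageUtils is a placeholder class name for illustration only.
UIImage *source = [UIImage imageNamed:@"photo.jpg"]; // should not carry an alpha channel
UIImage *cleaned = [ImageUtils processImage:source];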
Upvotes: 4
Reputation: 11801
UIImage *image = [UIImage imageNamed:@"image.png"];
const CGFloat colorMasking[6] = {1.0, 1.0, 0.0, 0.0, 1.0, 1.0};
image = [UIImage imageWithCGImage: CGImageCreateWithMaskingColors(image.CGImage, colorMasking)];
You receive nil because the parameters you send are invalid. If you open the Apple documentation, you will see this:
Components
An array of color components that specify a color or range of colors to mask the image with. The array must contain 2N values { min[1], max[1], ... min[N], max[N] } where N is the number of components in color space of image. Each value in components must be a valid image sample value. If image has integer pixel components, then each value must be in the range [0 .. 2**bitsPerComponent - 1] (where bitsPerComponent is the number of bits/component of image). If image has floating-point pixel components, then each value may be any floating-point number which is a valid color component.
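For example, for a common 8-bit-per-component RGB image without alpha, every min/max pair has to fall in the 0..255 range. A sketch of a valid call for that case (the thresholds chosen for "magenta" here are illustrative guesses):

UIImage *image = [UIImage imageNamed:@"image.png"]; // assumed 8 bits per component, no alpha
// Mask pixels whose red is 200-255, green is 0-60 and blue is 200-255 (roughly magenta).
const CGFloat colorMasking[6] = {200, 255, 0, 60, 200, 255};
CGImageRef maskedRef = CGImageCreateWithMaskingColors(image.CGImage, colorMasking);
if (maskedRef != NULL) {
    UIImage *masked = [UIImage imageWithCGImage:maskedRef];
    CGImageRelease(maskedRef);
    // use masked ...
}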
You can quickly open the documentation by Option-clicking a function or class name, such as CGImageCreateWithMaskingColors.
Upvotes: 4