digit

Reputation: 197

Inconsistencies in colors when drawing

I have a UIImageView and I draw on it using [UIColor orangeColor]. I also have a function that is supposed to detect the color of the pixel that was tapped.

R: 1.000000 G: 0.501961 B: 0.000000

That's the RGB value I receive when reading back the pixel color for [UIColor orangeColor].

It should be:

R: 1.000000 G: 0.5 B: 0.000000

Here's my function:

- (UIColor *)colorAtPixel:(CGPoint)point {
    // Cancel if point is outside image coordinates
    if (!CGRectContainsPoint(CGRectMake(0.0f, 0.0f, _overlay_imageView.frame.size.width, _overlay_imageView.frame.size.height), point)) {
        return nil;
    }


    // Create a 1x1 pixel byte array and bitmap context to draw the pixel into.
    // Reference: http://stackoverflow.com/questions/1042830/retrieving-a-pixel-alpha-value-for-a-uiimage
    NSInteger pointX = trunc(point.x);
    NSInteger pointY = trunc(point.y);
    CGImageRef cgImage = _overlay_imageView.image.CGImage;
    NSUInteger width = CGImageGetWidth(cgImage);
    NSUInteger height = CGImageGetHeight(cgImage);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    int bytesPerPixel = 4;
    int bytesPerRow = bytesPerPixel * 1;
    NSUInteger bitsPerComponent = 8;
    unsigned char pixelData[4] = { 0, 0, 0, 0 };
    CGContextRef context = CGBitmapContextCreate(pixelData,
                                                 1,
                                                 1,
                                                 bitsPerComponent,
                                                 bytesPerRow,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextSetBlendMode(context, kCGBlendModeCopy);

    // Draw the pixel we are interested in onto the bitmap context
    CGContextTranslateCTM(context, -pointX, -pointY);
    CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, (CGFloat)width, (CGFloat)height), cgImage);
    CGContextRelease(context);

    // Convert color values [0..255] to floats [0.0..1.0]
    CGFloat red   = (CGFloat)pixelData[0] / 255.0f;
    CGFloat green = (CGFloat)pixelData[1] / 255.0f;
    CGFloat blue  = (CGFloat)pixelData[2] / 255.0f;
    CGFloat alpha = (CGFloat)pixelData[3] / 255.0f;

    return [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
}
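
For context, here is roughly how the function gets called. This is a simplified sketch: the viewDidLoad wiring, the handleTap: selector name, and the logging are illustrative, not my exact code.

// Illustrative only: attach a tap recognizer to the image view and log the
// color under the tapped point using colorAtPixel: above.
- (void)viewDidLoad {
    [super viewDidLoad];
    _overlay_imageView.userInteractionEnabled = YES;
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleTap:)];
    [_overlay_imageView addGestureRecognizer:tap];
}

- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    CGPoint point = [recognizer locationInView:_overlay_imageView];
    UIColor *color = [self colorAtPixel:point];
    CGFloat r = 0, g = 0, b = 0, a = 0;
    if ([color getRed:&r green:&g blue:&b alpha:&a]) {
        NSLog(@"R: %f G: %f B: %f", r, g, b);
    }
}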

Any ideas?

I should mention that my UIImageView has a clear background and sits on top of a black canvas. Could that be the issue?

Upvotes: 0

Views: 39

Answers (1)

Josh Homann

Reputation: 16327

There's nothing wrong with your function. This is a result of floating point math. Half of 255 (the maximum value of an unsigned byte) is either 127 or 128 depending on how you round, so the value comes back as either 127/255.0 or 128/255.0. Neither of those is 0.5; they are 0.498039215686275 and 0.501960784313725 respectively.

EDIT: I should add that the colors in the CGImage are stored as bytes, not floats. So when you create your orange with a float in UIColor, it gets stored as R: 255, G: 128, B: 0, A: 255. When you read those bytes back as floats, you get R: 1.0, G: 0.501961, B: 0.0, A: 1.0.
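
Here's a tiny standalone sketch of that round trip (the helper function name is just for illustration):

#import <UIKit/UIKit.h>
#import <math.h>

// Shows the round trip 0.5 takes through an 8-bit color channel.
static void ChannelRoundTripDemo(void) {
    // 0.5 * 255 = 127.5; the bitmap ends up storing the byte 128.
    unsigned char stored = (unsigned char)lround(0.5 * 255.0);   // 128
    CGFloat readBack = stored / 255.0;                           // 0.501961...
    NSLog(@"stored: %d  read back: %f", stored, readBack);
}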

Upvotes: 1
