Ivan Kovacevic

Reputation: 1332

CGContextDrawImage inconsistent byte values across devices?

I'm trying to compare two images (actually, to locate a smaller "sub-image" within a bigger image), and I'm loading the images using the method provided below.

The code below currently contains a test for-loop which sums up all the individual byte values. What I discovered is that this sum, and therefore the bytes, differ depending on which device the code is run on. My question is: why is that happening?

// Black and white configuration:
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
NSUInteger bytesPerPixel = 1;
NSUInteger bitsPerComponent = 8;
CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;

// Image
CGImageRef imageRef = [[UIImage imageNamed:@"image.jpg"] CGImage];
NSUInteger imageWidth = CGImageGetWidth(imageRef);
NSUInteger imageHeight = CGImageGetHeight(imageRef);
NSUInteger imageSize = imageHeight * imageWidth * bytesPerPixel;
NSUInteger imageBytesPerRow = bytesPerPixel * imageWidth;

unsigned char *imageRawData = calloc(imageSize, sizeof(unsigned char));
CGContextRef imageContext = CGBitmapContextCreate(imageRawData, imageWidth, imageHeight, bitsPerComponent,
                                                  imageBytesPerRow, colorSpace, bitmapInfo);

// Draw the actual image to the bitmap context
CGContextDrawImage(imageContext, CGRectMake(0, 0, imageWidth, imageHeight), imageRef);
CGContextRelease(imageContext);


NSUInteger sum = 0;
for (NSUInteger byteIndex = 0; byteIndex < imageSize; byteIndex++)
{
    sum += imageRawData[byteIndex];
}

NSLog(@"Sum: %i", sum); // Output on simulator:    Sum: 18492272
                        // Output on iPhone 3GS:   Sum: 18494036
                        // Output on another 3GS:  Sum: 18494015
                        // Output on iPhone 4:     Sum: 18494015


free(imageRawData);
CGColorSpaceRelease(colorSpace);

Upvotes: 0

Views: 181

Answers (2)

ipmcc

Reputation: 29886

Are the devices all running the same version of the OS? Another possibility (beyond color spaces, which someone already mentioned) is that the JPG decoding libraries may be subtly different. As JPEG is a lossy image format, it's not inconceivable that different decoders would produce resulting bitmaps that are not bit-equal. It seems reasonable to posit that, given the heavy use of images in the iOS UI, the JPG decoder is something that would be undergoing constant tuning for maximum performance.

I'd even believe it conceivable that, with the same OS version running on different models of device (i.e. different processors), the results could fail to be bit-equal if there were multiple versions of the JPG decoder, each heavily optimized for a specific CPU, although that would not explain the difference between two devices of the same model running the same OS.

You might try to re-run the experiment with an image in a lossless format.
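For instance (a sketch only; "image.png" is just a placeholder name for a lossless copy of the same asset, which you'd have to add to the bundle yourself):

// Hypothetical lossless variant of the same image. PNG decodes
// deterministically, so if the JPG decoder is the culprit, the byte
// sums should now match across devices.
CGImageRef imageRef = [[UIImage imageNamed:@"image.png"] CGImage];
// ...then draw into the same grayscale bitmap context and sum the bytes as before.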

It may also be worth pointing out that providing your own backing memory for a CGBitmapContext without making special allowances for word alignment is likely to lead to poor performance. For instance, you have:

NSUInteger imageBytesPerRow = bytesPerPixel * imageWidth;

If imageBytesPerRow is not a multiple of the CPU's native word length, you're going to get sub-optimal performance.
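A common workaround is sketched below (the 64-byte alignment value is my assumption, not an API requirement; the other variables are taken from your code). Round each row up to an aligned size and index the buffer accordingly:

// Round each row up to a 64-byte boundary before creating the context.
NSUInteger alignment = 64;
NSUInteger imageBytesPerRow = ((bytesPerPixel * imageWidth + alignment - 1) / alignment) * alignment;

unsigned char *imageRawData = calloc(imageBytesPerRow * imageHeight, sizeof(unsigned char));
CGContextRef imageContext = CGBitmapContextCreate(imageRawData, imageWidth, imageHeight, bitsPerComponent,
                                                  imageBytesPerRow, colorSpace, bitmapInfo);

// With row padding, pixel (x, y) lives at imageRawData[y * imageBytesPerRow + x],
// so skip the trailing pad bytes in each row when summing.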

Upvotes: 2

nielsbot

Reputation: 16022

I assume the "device gray" color space varies by device. Try a device-independent color space instead; see the sketch below.
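Something along these lines might be worth a try (a sketch; it assumes the named generic gray color space is available on your deployment target, which older iOS versions may not expose):

// Device-independent (named) gray color space instead of device gray.
// CGColorSpaceCreateWithName returns NULL if the name isn't supported.
CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericGrayGamma2_2);
if (colorSpace == NULL) {
    colorSpace = CGColorSpaceCreateDeviceGray(); // fall back to the original behaviour
}
// ...create the bitmap context and draw exactly as in the question, then:
CGColorSpaceRelease(colorSpace);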

Upvotes: 0
