Reputation: 1
I need to read the colour of a pixel at a given point in an image. The code below works in the simulator for all iPhones (including the iPhone 6 Plus) except the iPhone 6.
I do not know why, but my guess is that the pixel index is incorrect, since it picks up the colour from the wrong location. I would appreciate any help. This is the code I have:
// Snapshot the view into a UIImage
UIGraphicsBeginImageContext(upperCaseView.frame.size)
upperCaseView.layer.renderInContext(UIGraphicsGetCurrentContext()!)
snapshotImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()

// Get a pointer to the raw pixel bytes
let pixelData = CGDataProviderCopyData(CGImageGetDataProvider(snapshotImage.CGImage))
let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)

// Byte offset of the pixel, assuming each row is exactly width * 4 bytes
let pixelInfo: Int = ((Int(snapshotImage.size.width) * Int(point.y)) + Int(point.x)) * 4

// Print the R, G, and B components
print(data[pixelInfo])
print(data[pixelInfo + 1])
print(data[pixelInfo + 2])
Upvotes: 0
Views: 89
Reputation: 1
Thank you so much, Scott Thompson.
I used CGImageGetBytesPerRow instead of the image width, and now it works: a row of pixel data can be padded for alignment, so its length is not necessarily width * 4 bytes. The correct code is below:
// Snapshot the view into a UIImage
UIGraphicsBeginImageContext(upperCaseView.frame.size)
upperCaseView.layer.renderInContext(UIGraphicsGetCurrentContext()!)
snapshotImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()

// Get a pointer to the raw pixel bytes
let pixelData = CGDataProviderCopyData(CGImageGetDataProvider(snapshotImage.CGImage))
let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)

// Ask the image for its actual layout instead of assuming width * 4
let bytesPerPixel = CGImageGetBitsPerPixel(snapshotImage.CGImage) / 8
let bytesPerRow = CGImageGetBytesPerRow(snapshotImage.CGImage)

// Offset of the pixel: full rows above it, plus the pixels to its left
let pixelInfo: Int = (bytesPerRow * Int(point.y)) + (Int(point.x) * bytesPerPixel)

// Print the R, G, and B components
print(data[pixelInfo])
print(data[pixelInfo + 1])
print(data[pixelInfo + 2])
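For anyone hitting the same problem, here is a minimal sketch of the same idea wrapped in a reusable helper. pixelColorInView is a hypothetical name, not an API. It also snapshots at the screen scale and converts the point from points to pixels, and it assumes an RGBA byte order, which is not guaranteed for every image.

import UIKit

// A minimal sketch, assuming `point` is in the view's own coordinate space.
func pixelColorInView(view: UIView, atPoint point: CGPoint) -> UIColor? {
    // Snapshot at the screen scale (0 = main screen scale) so Retina
    // devices are handled correctly.
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0)
    view.layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    guard let cgImage = image.CGImage,
          pixelData = CGDataProviderCopyData(CGImageGetDataProvider(cgImage)) else {
        return nil
    }
    let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)

    // The snapshot is measured in pixels; the point is in points.
    let scale = image.scale
    let x = Int(point.x * scale)
    let y = Int(point.y * scale)
    guard x >= 0 && x < CGImageGetWidth(cgImage) &&
          y >= 0 && y < CGImageGetHeight(cgImage) else {
        return nil
    }

    // Index with the image's real layout, never width * 4.
    let bytesPerPixel = CGImageGetBitsPerPixel(cgImage) / 8
    let bytesPerRow = CGImageGetBytesPerRow(cgImage)
    let offset = (y * bytesPerRow) + (x * bytesPerPixel)

    // Assumes RGBA ordering; a BGRA image would need the channels swapped.
    let r = CGFloat(data[offset])     / 255.0
    let g = CGFloat(data[offset + 1]) / 255.0
    let b = CGFloat(data[offset + 2]) / 255.0
    let a = CGFloat(data[offset + 3]) / 255.0
    return UIColor(red: r, green: g, blue: b, alpha: a)
}

With that in place, the original call site would become something like let colour = pixelColorInView(upperCaseView, atPoint: point).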
Upvotes: 0