Liam Vinson

Reputation: 81

Retrieving pixel data not working correctly with UIGraphicsImageRenderer image

I have the following code to get the pixel data from a UIImage. It works for most images, but it does not work when I create the image using UIGraphicsImageRenderer. I was hoping someone knew a solution to this.

My current code generates a simple image, but accessing its pixel data then gives unexpected results.

func myDraw() {

    let renderer = UIGraphicsImageRenderer(size: CGSize(width: 200, height: 200))
    let image = renderer.image { context in

        context.cgContext.setFillColor(UIColor.black.cgColor)
        context.cgContext.addRect(CGRect(x: 0, y: 0, width: 100, height: 100))
        context.cgContext.fillPath()

        context.cgContext.setFillColor(UIColor.red.cgColor)
        context.cgContext.addRect(CGRect(x: 100, y: 100, width: 100, height: 100))
        context.cgContext.fillPath()

    }

    let providerData = image.cgImage!.dataProvider!.data
    let data = CFDataGetBytePtr(providerData)!
    var pixels = [PixelData]()
    for i in stride(from: 0, to: 160000-1, by: 4) {
        pixels.append(PixelData(a:data[i+3], r:data[i+0], g:data[i+1], b:data[i+2]))
    }
    self.canvas.image = self.imageFromARGB32Bitmap(pixels: pixels, width: 200, height: 200)

}
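
Here PixelData is just a small struct of four UInt8 components (alpha first), something like this:

struct PixelData {
    // presumed definition; four packed UInt8s so MemoryLayout<PixelData>.stride == 4
    var a: UInt8
    var r: UInt8
    var g: UInt8
    var b: UInt8
}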

I have used the following code to turn the pixel data back into an image, to check whether the reading was working correctly.

func imageFromARGB32Bitmap(pixels: [PixelData], width: Int, height: Int) -> UIImage? {
    guard width > 0 && height > 0 else { return nil }
    guard pixels.count == width * height else { return nil }

    let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedFirst.rawValue)
    let bitsPerComponent = 8
    let bitsPerPixel = 32

    var data = pixels // Copy to mutable []
    guard let providerRef = CGDataProvider(data: NSData(bytes: &data,
                                                        length: data.count * MemoryLayout<PixelData>.size)
        )
        else { return nil }

    guard let cgim = CGImage(
        width: width,
        height: height,
        bitsPerComponent: bitsPerComponent,
        bitsPerPixel: bitsPerPixel,
        bytesPerRow: width * MemoryLayout<PixelData>.size,
        space: rgbColorSpace,
        bitmapInfo: bitmapInfo,
        provider: providerRef,
        decode: nil,
        shouldInterpolate: true,
        intent: .defaultIntent
        )
        else { return nil }

    return UIImage(cgImage: cgim)
}

Upvotes: 1

Views: 1165

Answers (2)

JoShin

Reputation: 71

Use render.pngData instead of render.image, then get the image from UIImage(data: pngData).
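
A rough sketch of that approach (the sizes just mirror the example below):

  let render = UIGraphicsImageRenderer(size: .init(width: 414, height: 100))
  // render straight to PNG data instead of to a UIImage
  let pngData = render.pngData { ctx in
      ctx.cgContext.setFillColor(UIColor.red.cgColor)
      ctx.cgContext.fill(CGRect(x: 0, y: 0, width: 10, height: 10))
  }
  // then decode the PNG back into a UIImage before reading pixels
  let img = UIImage(data: pngData)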

I ran into the same problem when I wanted to get a pixel color from a UIGraphicsImageRenderer-generated image.

The code:

  let render = UIGraphicsImageRenderer(size: .init(width: 414, height: 100))
  // on an iPhone XS the scale is 3, so the rendered image is actually 1242x300 pixels
  let img = render.image { ctx in
      ctx.cgContext.setFillColor(UIColor.red.cgColor)
      ctx.cgContext.addRect(.init(x: 0, y: 0, width: 10, height: 10))
      ctx.cgContext.drawPath(using: .fill)
  }

Print the img pixel color information:

 let pixelData = img.cgImage!.dataProvider!.data
 let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)
 let width = img.cgImage!.width
 let height = img.cgImage!.height

 for y in 0..<height {
     for x in 0..<width {
         let pixelInfo = (width * y + x) * 4

         print("xy: \(x) \(y) rgba: \(data[pixelInfo]) \(data[pixelInfo + 1]) \(data[pixelInfo + 2]) \(data[pixelInfo + 3])")
     }
 }

The print result:

 xy: 0 0 rgba: 255 0 0 255
 xy: 1 0 rgba: 255 0 0 255
 xy: 2 0 rgba: 255 0 0 255
 xy: 3 0 rgba: 255 0 0 255
 xy: 4 0 rgba: 255 0 0 255
 ...
 xy: 1240 0 rgba: 0 0 0 0
 xy: 1241 0 rgba: 0 0 0 0
 xy: 0 1 rgba: 0 0 0 0
 ...
 xy: 5 1 rgba: 0 0 0 0
 xy: 6 1 rgba: 255 0 0 255
 xy: 7 1 rgba: 255 0 0 255
 xy: 8 1 rgba: 255 0 0 255
 ...
 xy: 1240 1 rgba: 0 0 0 0
 xy: 1241 1 rgba: 0 0 0 0
 xy: 0 2 rgba: 0 0 0 0
 xy: 1 2 rgba: 0 0 0 0
 ...     
 xy: 11 2 rgba: 0 0 0 0
 xy: 12 2 rgba: 255 0 0 255
 ...

I found that there is an offset starting from the second row, and when I changed the img width from 414 to 400, the offset was gone.

I don't know the reason for certain; if someone knows, please add a comment below my answer.
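
One guess, not verified: the rows of the cgImage may be padded, so bytesPerRow can be larger than width * 4, and indexing with (width * y + x) * 4 then drifts a little further into the padding on every row. Using bytesPerRow for the row offset (with the same data, width and height as above) would avoid relying on that assumption:

 let bytesPerRow = img.cgImage!.bytesPerRow   // may be larger than width * 4

 for y in 0..<height {
     for x in 0..<width {
         // advance whole rows by bytesPerRow, pixels within a row by 4 bytes
         let pixelInfo = bytesPerRow * y + x * 4

         print("xy: \(x) \(y) rgba: \(data[pixelInfo]) \(data[pixelInfo + 1]) \(data[pixelInfo + 2]) \(data[pixelInfo + 3])")
     }
 }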

Upvotes: 0

Rob

Reputation: 437882

A few observations:

  1. Your code assumes that UIGraphicsImageRenderer generates images with a scale of 1, whereas by default it uses the scale of your device's screen (e.g. 2 or 3 on Retina devices).

    Instead, force the scale to 1:

    let format = UIGraphicsImageRendererFormat()
    format.scale = 1
    let renderer = UIGraphicsImageRenderer(size: CGSize(width: 200, height: 200), format: format)
    
  2. It’s not the issue here, but note that your code simply assumes that the image produced by UIGraphicsImageRenderer will use a particular byte order and pixel format, as does your imageFromARGB32Bitmap. If you look at Apple Technical Q&A 1509 (from which your code was undoubtedly originally adapted), they don’t just assume that the buffer will be in a particular format. When we want to manipulate or examine a buffer, we should (a) create a context of the desired format, (b) draw our image (or whatever) into that context, and only then look at the resulting buffer (see the sketch after this list).

  3. The imageFromARGB32Bitmap works, but it makes me a bit nervous.

    • The use of MemoryLayout<PixelData>.size: Apple advises:

      When allocating memory for multiple instances of T using an unsafe pointer, use a multiple of the type’s stride instead of its size.

      So, I’d use stride.

    • What if stride wasn’t 4, as you expect it to be? I can’t imagine it would ever not be 4, but the data provider assumes that the pixels are packed contiguously, four bytes apart. It’s a minor observation, but I might make this assumption explicit.

    • Are we 100% assured that passing &data will give us a contiguous buffer? I’d lean towards withContiguousStorageIfAvailable just to be safe.
       

    For example:

    func imageFromARGB32Bitmap(pixels: [PixelData], width: Int, height: Int) -> UIImage? {
        guard width > 0,
            height > 0,
            pixels.count == width * height else { return nil }
    
        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedFirst.rawValue)
        let bitsPerComponent = 8
        let bitsPerPixel = 32
    
        let stride = MemoryLayout<PixelData>.stride
        assert(stride == 4)
    
        return pixels.withContiguousStorageIfAvailable { bufferPointer -> UIImage? in
            let data = Data(buffer: bufferPointer)
    
            return CGDataProvider(data: data as CFData)
                .flatMap { CGImage(width: width,
                                   height: height,
                                   bitsPerComponent: bitsPerComponent,
                                   bitsPerPixel: bitsPerPixel,
                                   bytesPerRow: width * stride,
                                   space: rgbColorSpace,
                                   bitmapInfo: bitmapInfo,
                                   provider: $0,
                                   decode: nil,
                                   shouldInterpolate: true,
                                   intent: .defaultIntent) }
                .flatMap { UIImage(cgImage: $0) }
            } ?? nil
    }
    

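For completeness, here is a rough sketch of the redraw-into-a-known-format approach from point 2; the helper name rgbaBytes(of:) and the 8-bit RGBA premultiplied-last layout are just illustrative choices:

    func rgbaBytes(of image: UIImage) -> [UInt8]? {
        guard let cgImage = image.cgImage else { return nil }

        let width = cgImage.width
        let height = cgImage.height
        let bytesPerRow = width * 4
        var buffer = [UInt8](repeating: 0, count: bytesPerRow * height)

        // Draw into a context whose format *we* define (8-bit RGBA, premultiplied
        // alpha last), rather than trusting whatever layout the renderer happened to use.
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGImageAlphaInfo.premultipliedLast.rawValue

        let drawn = buffer.withUnsafeMutableBytes { rawBuffer -> Bool in
            guard let context = CGContext(data: rawBuffer.baseAddress,
                                          width: width,
                                          height: height,
                                          bitsPerComponent: 8,
                                          bytesPerRow: bytesPerRow,
                                          space: colorSpace,
                                          bitmapInfo: bitmapInfo) else { return false }
            context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
            return true
        }

        return drawn ? buffer : nil
    }
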
Upvotes: 3
