Cari95

Reputation: 313

Color detection at a distance (Swift)

Can someone point me in the right direction with color detection at a distance? The code below grabs the RGB values of an image properly as long as the object or point of interest is less than about 10 feet away. When the object is farther away, the code returns the wrong values. I want to take a picture of an object at a distance greater than 10 feet and detect the color of that object.

// At the top of your Swift file
extension UIImage {
    func getPixelColor(pos: CGPoint) -> UIColor {

        let pixelData = CGDataProviderCopyData(CGImageGetDataProvider(self.CGImage))
        let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)

        // Index of the first byte of the pixel, assuming 4 bytes (RGBA) per pixel
        let pixelInfo: Int = ((Int(self.size.width) * Int(pos.y)) + Int(pos.x)) * 4

        let r = CGFloat(data[pixelInfo]) / CGFloat(255.0)
        let g = CGFloat(data[pixelInfo+1]) / CGFloat(255.0)
        let b = CGFloat(data[pixelInfo+2]) / CGFloat(255.0)
        let a = CGFloat(data[pixelInfo+3]) / CGFloat(255.0)

        return UIColor(red: r, green: g, blue: b, alpha: a)
    }
}

Upvotes: 2

Views: 3335

Answers (2)

Digvijay Gida

Reputation: 89

You can read the average color of a UIImage using the CIAreaAverage filter: https://www.hackingwithswift.com/example-code/media/how-to-read-the-average-color-of-a-uiimage-using-ciareaaverage

extension UIImage {
    var averageColor: UIColor? {
        guard let inputImage = CIImage(image: self) else { return nil }
        let extentVector = CIVector(x: inputImage.extent.origin.x, y: inputImage.extent.origin.y, z: inputImage.extent.size.width, w: inputImage.extent.size.height)

        // Average every pixel in the image's extent down to a single color
        guard let filter = CIFilter(name: "CIAreaAverage", parameters: [kCIInputImageKey: inputImage, kCIInputExtentKey: extentVector]) else { return nil }
        guard let outputImage = filter.outputImage else { return nil }

        // Render the 1x1 result into a four-byte RGBA bitmap
        var bitmap = [UInt8](repeating: 0, count: 4)
        let context = CIContext(options: [.workingColorSpace: kCFNull])
        context.render(outputImage, toBitmap: &bitmap, rowBytes: 4, bounds: CGRect(x: 0, y: 0, width: 1, height: 1), format: .RGBA8, colorSpace: nil)

        return UIColor(red: CGFloat(bitmap[0]) / 255, green: CGFloat(bitmap[1]) / 255, blue: CGFloat(bitmap[2]) / 255, alpha: CGFloat(bitmap[3]) / 255)
    }
}
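
A hypothetical usage example (the `imageView` outlet is an assumption, not part of the answer):

if let image = imageView.image, let average = image.averageColor {
    // Tint the background with the average color of the displayed image
    view.backgroundColor = average
}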

Upvotes: 0

R Menke

Reputation: 8401

I am a photographer, and what you are trying to do is very similar to setting a white balance in post-processing or using the color picker in Photoshop.

Digital cameras don't have pixels that capture the full spectrum of light at once; they have triplets of sensor pixels for red, green, and blue. The captured information is interpolated into full-color pixels, and this interpolation can give very bad results. Setting the white balance in post on an image taken at night is almost impossible.

Reasons for bad interpolation:

  • Pixels are bigger than the smallest discernible object in the scene (moiré artifacts).
  • Low-light situations where digital gain amplifies color differences (color noise artifacts).
  • The image was converted to a low-quality JPEG but has lots of edges (JPEG artifacts).

If it is a low-quality JPEG, get a better source image.

Fix

All you have to do to get a more accurate reading is blur the image. The smallest acceptable blur is 3 pixels, because this will undo some of the interpolation. Bigger blurs might be better.

Since blurs are expensive, it is best to crop the image to a multiple of the blur radius before blurring. You can't take a precise fit, because the blur will also soften the edges, and beyond the edges the image is black. That would influence your reading.

It might be best if you also enforce an upper limit on the blur radius, as sketched below.
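
A minimal sketch of such a clamp (the cap of 10 is an arbitrary assumption, not a value from the answer):

// Cap the blur radius so a large radius never triggers an
// overly expensive, overly wide blur. The limit is arbitrary.
let maxBlurRadius: CGFloat = 10
let clampedRadius = min(radius, maxBlurRadius)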


Shortcut to get the center of something with a size.

extension CGSize {

    var center : CGPoint {
        get {
            return CGPoint(x: width / 2, y: height / 2)
        }
    }
}

The UIImage stuff

extension UIImage {

    func blur(radius: CGFloat) -> UIImage? {
        // extensions of UIImage don't know what a CIImage is...
        typealias CIImage = CoreImage.CIImage

        // blur of your choice
        guard let blurFilter = CIFilter(name: "CIBoxBlur") else {
            return nil
        }

        blurFilter.setValue(CIImage(image: self), forKey: kCIInputImageKey)
        blurFilter.setValue(radius, forKey: kCIInputRadiusKey)

        let ciContext = CIContext(options: nil)

        guard let result = blurFilter.valueForKey(kCIOutputImageKey) as? CIImage else {
            return nil
        }

        // The blur spreads beyond the original bounds, so expand the rect accordingly
        let blurRect = CGRect(x: -radius, y: -radius, width: self.size.width + (radius * 2), height: self.size.height + (radius * 2))

        let cgImage = ciContext.createCGImage(result, fromRect: blurRect)

        return UIImage(CGImage: cgImage)

    }

    func crop(cropRect : CGRect) -> UIImage? {

        guard let imgRef = CGImageCreateWithImageInRect(self.CGImage, cropRect) else {
            return nil
        }
        return UIImage(CGImage: imgRef)

    }

    func getPixelColor(atPoint point: CGPoint, radius:CGFloat) -> UIColor? {

        var pos = point
        var image = self

        // if the radius is too small -> skip
        if radius > 1 {

            // Crop to a generous area (8x the radius) around the point,
            // so the reading at the centre isn't contaminated by edge blur
            let cropRect = CGRect(x: point.x - (radius * 4), y: point.y - (radius * 4), width: radius * 8, height: radius * 8)
            guard let cropImg = self.crop(cropRect) else {
                return nil
            }

            guard let blurImg = cropImg.blur(radius) else {
                return nil
            }

            pos = blurImg.size.center
            image = blurImg

        }

        let pixelData = CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage))
        let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)

        // Index of the first byte of the pixel, assuming 4 bytes (RGBA) per pixel
        let pixelInfo: Int = ((Int(image.size.width) * Int(pos.y)) + Int(pos.x)) * 4

        let r = CGFloat(data[pixelInfo]) / CGFloat(255.0)
        let g = CGFloat(data[pixelInfo+1]) / CGFloat(255.0)
        let b = CGFloat(data[pixelInfo+2]) / CGFloat(255.0)
        let a = CGFloat(data[pixelInfo+3]) / CGFloat(255.0)

        return UIColor(red: r, green: g, blue: b, alpha: a)
    }  
}
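
Hypothetical usage (the `photo` image and `tapLocation` point are assumed to exist in your own code):

// Sample an averaged color around a tapped point,
// blurring with a 3-pixel radius first
if let color = photo.getPixelColor(atPoint: tapLocation, radius: 3) {
    print(color)
}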

Side note:

Your problem might not be the color-grabbing function but how you set the point. If you are doing it by touch and the object is farther away, and thus smaller on the screen, you might not be able to set the point accurately enough.

Upvotes: 6
