Zoyt

Reputation: 4967

Get average color of UIImage in Swift

I was recently attempting to convert the code from here into Swift. However, I keep getting a white color, no matter the image. Here's my code:

// Playground - noun: a place where people can play

import UIKit

extension UIImage {
    func averageColor() -> UIColor {
        var colorSpace = CGColorSpaceCreateDeviceRGB()
        var rgba: [CGFloat] = [0,0,0,0]
        var context = CGBitmapContextCreate(&rgba, 1, 1, 8, 4, colorSpace, CGBitmapInfo.fromRaw(CGImageAlphaInfo.PremultipliedLast.toRaw())!)
        rgba

        CGContextDrawImage(context, CGRectMake(0, 0, 1, 1), self.CGImage)

        if rgba[3] > 0 {
            var alpha = rgba[3] / 255
            var multiplier = alpha / 255
            return UIColor(red: rgba[0] * multiplier, green: rgba[1] * multiplier, blue: rgba[2] * multiplier, alpha: alpha)
        } else {
            return UIColor(red: rgba[0] / 255, green: rgba[1] / 255, blue: rgba[2] / 255, alpha: rgba[3] / 255)
        }
    }
}

var img = UIImage(data: NSData(contentsOfURL: NSURL(string: "http://upload.wikimedia.org/wikipedia/commons/c/c3/Aurora_as_seen_by_IMAGE.PNG")))

img.averageColor()

Thanks in advance.

Upvotes: 14

Views: 19619

Answers (5)

God of Biscuits

Reputation: 1338

Core Image in iOS 9: use the CIAreaAverage filter and pass the extent of your entire image as the area to be averaged.

Plus, it's much faster, since it will run either on the GPU or as a highly optimized CPU CIKernel.
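A minimal sketch of that approach (Swift 3, iOS 9+) might look like this; the method name averageColorViaCoreImage and the force-unwrapped cgImage are just assumptions made for the example:

import UIKit
import CoreImage

extension UIImage {
    // Hypothetical helper name, for illustration only.
    func averageColorViaCoreImage() -> UIColor? {
        // Wrap the backing CGImage and hand its full extent to CIAreaAverage.
        guard let cgImage = cgImage else { return nil }
        let input = CIImage(cgImage: cgImage)
        let extentVector = CIVector(x: input.extent.origin.x, y: input.extent.origin.y,
                                    z: input.extent.size.width, w: input.extent.size.height)
        guard let filter = CIFilter(name: "CIAreaAverage",
                                    withInputParameters: [kCIInputImageKey: input,
                                                          kCIInputExtentKey: extentVector]),
              let output = filter.outputImage else { return nil }

        // The filter's output is a 1x1 image; read back its single RGBA pixel.
        var bitmap = [UInt8](repeating: 0, count: 4)
        CIContext().render(output, toBitmap: &bitmap, rowBytes: 4,
                           bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                           format: kCIFormatRGBA8, colorSpace: CGColorSpaceCreateDeviceRGB())
        return UIColor(red: CGFloat(bitmap[0]) / 255.0, green: CGFloat(bitmap[1]) / 255.0,
                       blue: CGFloat(bitmap[2]) / 255.0, alpha: CGFloat(bitmap[3]) / 255.0)
    }
}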

Upvotes: 18

Carmelo Gallo

Reputation: 283

Here's a solution:

func averageColor() -> UIColor {

    // Allocate a 4-byte buffer for a single RGBA pixel.
    let rgba = UnsafeMutablePointer<CUnsignedChar>.alloc(4)
    let colorSpace: CGColorSpaceRef = CGColorSpaceCreateDeviceRGB()
    let info = CGBitmapInfo(CGImageAlphaInfo.PremultipliedLast.rawValue)
    let context: CGContextRef = CGBitmapContextCreate(rgba, 1, 1, 8, 4, colorSpace, info)

    // Drawing the whole image into a 1x1 context leaves the averaged color in that single pixel.
    CGContextDrawImage(context, CGRectMake(0, 0, 1, 1), self.CGImage)

    let color: UIColor

    if rgba[3] > 0 {

        let alpha: CGFloat = CGFloat(rgba[3]) / 255.0
        let multiplier: CGFloat = alpha / 255.0

        color = UIColor(red: CGFloat(rgba[0]) * multiplier, green: CGFloat(rgba[1]) * multiplier, blue: CGFloat(rgba[2]) * multiplier, alpha: alpha)

    } else {

        color = UIColor(red: CGFloat(rgba[0]) / 255.0, green: CGFloat(rgba[1]) / 255.0, blue: CGFloat(rgba[2]) / 255.0, alpha: CGFloat(rgba[3]) / 255.0)
    }

    // Free the manually allocated pixel buffer before returning.
    rgba.dealloc(4)
    return color
}

Upvotes: 9

Etan

Reputation: 17544

import UIKit

extension UIImage {
    func areaAverage() -> UIColor {
        var bitmap = [UInt8](count: 4, repeatedValue: 0)

        if #available(iOS 9.0, *) {
            // Get average color.
            let context = CIContext()
            let inputImage = CIImage ?? CoreImage.CIImage(CGImage: CGImage!)
            let extent = inputImage.extent
            let inputExtent = CIVector(x: extent.origin.x, y: extent.origin.y, z: extent.size.width, w: extent.size.height)
            let filter = CIFilter(name: "CIAreaAverage", withInputParameters: [kCIInputImageKey: inputImage, kCIInputExtentKey: inputExtent])!
            let outputImage = filter.outputImage!
            let outputExtent = outputImage.extent
            assert(outputExtent.size.width == 1 && outputExtent.size.height == 1)

            // Render to bitmap.
            context.render(outputImage, toBitmap: &bitmap, rowBytes: 4, bounds: CGRect(x: 0, y: 0, width: 1, height: 1), format: kCIFormatRGBA8, colorSpace: CGColorSpaceCreateDeviceRGB())
        } else {
            // Create 1x1 context that interpolates pixels when drawing to it.
            let context = CGBitmapContextCreate(&bitmap, 1, 1, 8, 4, CGColorSpaceCreateDeviceRGB(), CGBitmapInfo.ByteOrderDefault.rawValue | CGImageAlphaInfo.PremultipliedLast.rawValue)!
            let inputImage = CGImage ?? CIContext().createCGImage(CIImage!, fromRect: CIImage!.extent)

            // Render to bitmap.
            CGContextDrawImage(context, CGRect(x: 0, y: 0, width: 1, height: 1), inputImage)
        }

        // Compute result.
        let result = UIColor(red: CGFloat(bitmap[0]) / 255.0, green: CGFloat(bitmap[1]) / 255.0, blue: CGFloat(bitmap[2]) / 255.0, alpha: CGFloat(bitmap[3]) / 255.0)
        return result
    }
}

Swift 3

func areaAverage() -> UIColor {
    var bitmap = [UInt8](repeating: 0, count: 4)

    if #available(iOS 9.0, *) {
        // Get average color.
        let context = CIContext()
        let inputImage: CIImage = ciImage ?? CoreImage.CIImage(cgImage: cgImage!)
        let extent = inputImage.extent
        let inputExtent = CIVector(x: extent.origin.x, y: extent.origin.y, z: extent.size.width, w: extent.size.height)
        let filter = CIFilter(name: "CIAreaAverage", withInputParameters: [kCIInputImageKey: inputImage, kCIInputExtentKey: inputExtent])!
        let outputImage = filter.outputImage!
        let outputExtent = outputImage.extent
        assert(outputExtent.size.width == 1 && outputExtent.size.height == 1)

        // Render to bitmap.
        context.render(outputImage, toBitmap: &bitmap, rowBytes: 4, bounds: CGRect(x: 0, y: 0, width: 1, height: 1), format: kCIFormatRGBA8, colorSpace: CGColorSpaceCreateDeviceRGB())
    } else {
        // Create 1x1 context that interpolates pixels when drawing to it.
        let context = CGContext(data: &bitmap, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: CGColorSpaceCreateDeviceRGB(), bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!
        let inputImage = cgImage ?? CIContext().createCGImage(ciImage!, from: ciImage!.extent)

        // Render to bitmap.
        context.draw(inputImage!, in: CGRect(x: 0, y: 0, width: 1, height: 1))
    }

    // Compute result.
    let result = UIColor(red: CGFloat(bitmap[0]) / 255.0, green: CGFloat(bitmap[1]) / 255.0, blue: CGFloat(bitmap[2]) / 255.0, alpha: CGFloat(bitmap[3]) / 255.0)
    return result
}

Upvotes: 15

Alessign

Reputation: 788

Swift 3:

func areaAverage() -> UIColor {

    var bitmap = [UInt8](repeating: 0, count: 4)

    // Wrap the backing CGImage in a CIImage and pass its full extent to CIAreaAverage.
    let context = CIContext(options: nil)
    let inputImage = CIImage(cgImage: self.cgImage!)
    let extent = inputImage.extent
    let inputExtent = CIVector(x: extent.origin.x, y: extent.origin.y, z: extent.size.width, w: extent.size.height)
    let filter = CIFilter(name: "CIAreaAverage", withInputParameters: [kCIInputImageKey: inputImage, kCIInputExtentKey: inputExtent])!
    let outputImage = filter.outputImage!
    let outputExtent = outputImage.extent
    assert(outputExtent.size.width == 1 && outputExtent.size.height == 1)

    // Render the 1x1 result to a bitmap.
    context.render(outputImage, toBitmap: &bitmap, rowBytes: 4, bounds: CGRect(x: 0, y: 0, width: 1, height: 1), format: kCIFormatRGBA8, colorSpace: CGColorSpaceCreateDeviceRGB())

    // Compute result.
    let result = UIColor(red: CGFloat(bitmap[0]) / 255.0, green: CGFloat(bitmap[1]) / 255.0, blue: CGFloat(bitmap[2]) / 255.0, alpha: CGFloat(bitmap[3]) / 255.0)
    return result
}

Upvotes: 8

Janie Larson

Reputation: 504

Are you setting up your context correctly? Looking at the documentation for the CGBitmapContext Reference:

https://developer.apple.com/library/ios/documentation/graphicsimaging/Reference/CGBitmapContext/index.html#//apple_ref/c/func/CGBitmapContextCreate

it looks like you are only allocating enough memory for the image that fits in the CGFloat array, and you are telling the compiler that your image is going to be just one pixel by one pixel.

That one-by-one-pixel size is then confirmed by the CGRect you pass to CGContextDrawImage.

If the Playground is only creating a one-by-one-pixel image, that would explain why you are only seeing a white screen.
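To make the memory math concrete, here is a rough sketch (Swift 3); the 100 x 50 size is just an illustrative assumption, not something from the question:

import CoreGraphics

// The buffer handed to CGContext must hold bytesPerRow * height bytes.
// A 1x1 RGBA8 context therefore needs only 4 bytes...
var onePixel = [UInt8](repeating: 0, count: 4)
let tinyContext = CGContext(data: &onePixel, width: 1, height: 1, bitsPerComponent: 8,
                            bytesPerRow: 4, space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)

// ...while a hypothetical 100x50 context needs a correspondingly larger buffer.
let width = 100
let height = 50
let bytesPerRow = width * 4
var pixels = [UInt8](repeating: 0, count: bytesPerRow * height)
let largerContext = CGContext(data: &pixels, width: width, height: height, bitsPerComponent: 8,
                              bytesPerRow: bytesPerRow, space: CGColorSpaceCreateDeviceRGB(),
                              bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)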

Upvotes: 0
