Paul S.

Reputation: 1352

How to get WhiteBalance (Kelvin) from captureOutput

Apple provides various ways to change and read the Kelvin value of an AVCaptureDevice:

https://developer.apple.com/documentation/avfoundation/avcapturedevice/white_balance

Example:

    guard let videoDevice = AVCaptureDevice
        .default(.builtInWideAngleCamera, for: .video, position: .back) else {
        return
    }

    guard
        let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
        captureSession.canAddInput(videoDeviceInput) else {

        print("There seems to be a problem with the camera on your device.")
        
        return
    }
    
    captureSession.addInput(videoDeviceInput)
    
    let kelvin = videoDevice.temperatureAndTintValues(for: videoDevice.deviceWhiteBalanceGains)
    print("Kelvin temp \(kelvin.temperature)")
    print("Kelvin tint \(kelvin.tint)")
    
    let captureOutput = AVCaptureVideoDataOutput()
    
    captureOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    captureOutput.setSampleBufferDelegate(self, queue: DispatchQueue.global(qos: DispatchQoS.QoSClass.default))
    
    captureSession.addOutput(captureOutput)

This always returns the same values:

Kelvin temp 3900.0889
Kelvin tint 4.966322

How can I get the White Balance (Kelvin value) through the live camera feed?

Upvotes: 0

Views: 155

Answers (1)

Salvo89

Reputation: 33

It gives you only one value because videoDevice.temperatureAndTintValues(for: videoDevice.deviceWhiteBalanceGains) is called only once, during setup. To get a value that updates along with the camera feed, you have two options:

  1. Key-value observing, which will notify when the WB changes
  2. Call the function videoDevice.temperatureAndTintValues(for: videoDevice.deviceWhiteBalanceGains) for each frame.

I would suggest the second, since key-value observing is somewhat cumbersome. In that case, I assume you have already implemented func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) of AVCaptureVideoDataOutputSampleBufferDelegate. That method is called every time a frame is delivered, so you can read videoDevice.temperatureAndTintValues for each frame of the live camera feed.
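A minimal sketch of option 2 (assuming videoDevice is stored as a property when the session is configured, as in the question's setup code):

```swift
// Sketch: read the white balance once per delivered frame.
// Assumes `videoDevice` is the AVCaptureDevice kept from session setup.
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    let kelvin = videoDevice.temperatureAndTintValues(for: videoDevice.deviceWhiteBalanceGains)
    print("Kelvin temp \(kelvin.temperature), tint \(kelvin.tint)")
}
```

Note that this delegate method runs on the queue you passed to setSampleBufferDelegate, so dispatch to the main queue before updating any UI with these values.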

For key-value observing, you first set up the observer (e.g. in viewDidAppear), for example:

    func addObserver() {
        self.addObserver(self, forKeyPath: "videoDevice.deviceWhiteBalanceGains", options: .new, context: &DeviceWhiteBalanceGainsContext)
    }

Keep a reference to the videoDevice, declaring it this way:

    @objc dynamic var videoDevice: AVCaptureDevice!

The @objc and dynamic keywords are required for key-value observing to work.

Now you can implement this function, which will be called every time the observed value changes:

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
        guard let context = context else {
            super.observeValue(forKeyPath: keyPath, of: object, change: change, context: nil)
            return
        }

        if context == &DeviceWhiteBalanceGainsContext {
            // do your work on the white balance here
        } else {
            // not our context: pass the notification up the chain
            super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        }
    }

Finally, you can define the context this way (I have it outside my ViewController):

    private var DeviceWhiteBalanceGainsContext = 0
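One detail worth adding: the observer should also be removed when you are done with it. A sketch, assuming the observer was added on self with the keyPath and context shown above:

```swift
deinit {
    // Must match the keyPath and context passed to
    // addObserver(_:forKeyPath:options:context:).
    self.removeObserver(self,
                        forKeyPath: "videoDevice.deviceWhiteBalanceGains",
                        context: &DeviceWhiteBalanceGainsContext)
}
```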

I have implemented both methods in my apps and they both work well.

WARNING: sometimes the white balance gains will be outside the allowed range (especially at startup), and temperatureAndTintValues(for:) raises an exception. Make sure to handle this, otherwise the app will crash.
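One way to guard against that (a sketch; the valid range for each gain is 1.0 up to the device's maxWhiteBalanceGain, and safeTemperatureAndTint is a hypothetical helper name):

```swift
// Sketch: validate the gains before converting them, since
// temperatureAndTintValues(for:) raises an Objective-C exception
// (not a catchable Swift error) when a gain is out of range.
func safeTemperatureAndTint(for device: AVCaptureDevice)
        -> AVCaptureDevice.WhiteBalanceTemperatureAndTintValues? {
    let gains = device.deviceWhiteBalanceGains
    let maxGain = device.maxWhiteBalanceGain
    guard [gains.redGain, gains.greenGain, gains.blueGain]
        .allSatisfy({ $0.isFinite && $0 >= 1.0 && $0 <= maxGain }) else {
        return nil  // out-of-range gains: skip this frame's reading
    }
    return device.temperatureAndTintValues(for: gains)
}
```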

Upvotes: 0
