Hugo Scott-Slade

Reputation: 897

Is there a way to get the brightness level of the camera stream on iOS?

I am using the iPhone/iPad camera to get a video stream and running recognition on it, but lighting changes hurt the robustness. I have tested different settings under different lighting and can get it to work, but what I need is a way to adjust those settings at run time.

I can calculate a simple brightness check on each frame, but the camera's auto-exposure adjusts and throws my results off. I can watch for sharp changes and run checks then, but gradual changes would throw my results off as well.
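For reference, the simple per-frame check I mean is roughly the following; a sketch that assumes the video data output is configured for the biplanar YUV420 pixel format and just averages the luma (Y) plane:

#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

// Naive per-frame brightness check: average the luma (Y) plane.
// Assumes the AVCaptureVideoDataOutput is set to
// kCVPixelFormatType_420YpCbCr8BiPlanarFullRange (or the video-range variant).
static float averageLuma(CMSampleBufferRef sampleBuffer)
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    uint8_t *yPlane = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    size_t width  = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
    size_t height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
    size_t stride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);

    uint64_t sum = 0;
    for (size_t row = 0; row < height; row++) {
        uint8_t *line = yPlane + row * stride;
        for (size_t col = 0; col < width; col++)
            sum += line[col];
    }

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    return (float)sum / (float)(width * height); // 0 (black) .. 255 (white)
}

This is exactly the number that drifts as the camera compensates, which is why I want the unfiltered value instead.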

Ideally I'd like to access the camera/EXIF data for the stream and see what it registers the unfiltered brightness as. Is there a way to do this?

(I am targeting devices running iOS 5 and above.)

Thank you

Upvotes: 6

Views: 2630

Answers (2)

James Bush

Reputation: 1527

Complete code, as used in my own app:

- (void)setupAVCapture {

    //-- Setup the capture session.
    _session = [[AVCaptureSession alloc] init];
    [_session beginConfiguration];

    //-- Set preset session size.
    [_session setSessionPreset:AVCaptureSessionPreset1920x1080];

    //-- Create a video device and an input from that device. Add the input to the capture session.
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice == nil)
        assert(0);

    //-- Add the device to the session.
    NSError *error;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (error)
        assert(0);

    [_session addInput:input];

    //-- Create the output for the capture session.
    AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [dataOutput setAlwaysDiscardsLateVideoFrames:YES]; // Probably want to set this to NO when recording

    //-- Set to YUV420.
    [dataOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
                                                             forKey:(id)kCVPixelBufferPixelFormatTypeKey]]; // Necessary for manual preview

    //-- Set dispatch to be on the main thread so OpenGL can do things with the data.
    [dataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    [_session addOutput:dataOutput];
    [_session commitConfiguration];

    [_session startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Requires ImageIO (for the kCGImageProperty... keys) in addition to CoreMedia.
    CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL,
                                                                 sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadata = [[NSDictionary alloc]
                              initWithDictionary:(__bridge NSDictionary *)metadataDict];
    CFRelease(metadataDict);
    NSDictionary *exifMetadata = [[metadata
                                   objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
    self.autoBrightness = [[exifMetadata
                            objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];

    // Linearly remap the EXIF BrightnessValue from its observed range to [1.0, 3.0].
    float oldMin = -4.639957; // dark
    float oldMax = 4.639957; // light
    if (self.autoBrightness > oldMax) oldMax = self.autoBrightness; // widen the range if brighter than expected

    self.lumaThreshold = ((self.autoBrightness - oldMin) * ((3.0 - 1.0) / (oldMax - oldMin))) + 1.0;

    NSLog(@"brightnessValue %f", self.autoBrightness);
    NSLog(@"lumaThreshold %f", self.lumaThreshold);
}

The lumaThreshold variable is sent as a uniform to my fragment shader, which multiplies the Y sampler texture by it to arrive at the ideal luminosity for the brightness of the environment. Right now it uses the back camera; I'll probably switch to the front camera, since I'm only changing the "brightness" of the screen to adjust for indoor/outdoor viewing, and the user's eyes face the front of the device (not the back).
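For illustration, the shader side of this approach might look roughly like the sketch below. This is not the actual shader from the app; the program handle, sampler, and uniform names are assumptions, and chroma handling is omitted, so the output is grayscale.

#import <OpenGLES/ES2/gl.h>

// Illustrative fragment shader: scale the sampled luma by the lumaThreshold uniform.
static const char *kLumaFragmentShader =
    "varying highp vec2 v_texCoord;\n"
    "uniform sampler2D u_samplerY;        // texture holding the Y plane\n"
    "uniform highp float u_lumaThreshold;\n"
    "void main() {\n"
    "    highp float y = texture2D(u_samplerY, v_texCoord).r;\n"
    "    gl_FragColor = vec4(vec3(y * u_lumaThreshold), 1.0);\n"
    "}\n";

- (void)updateShaderUniforms
{
    // Assumes _program is a compiled and linked OpenGL ES 2.0 program built from the
    // fragment shader above; push the per-frame value each time the sample buffer
    // delegate recomputes lumaThreshold.
    glUseProgram(_program);
    glUniform1f(glGetUniformLocation(_program, "u_lumaThreshold"), self.lumaThreshold);
}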

Upvotes: 1

markturnip

Reputation: 402

Available in iOS 4.0 and above: it's possible to get EXIF information from a CMSampleBufferRef.

// Import ImageIO and add the framework to your project.
#import <ImageIO/CGImageProperties.h>

In your sample buffer delegate, toll-free bridging will get you an NSDictionary of results from Core Media's CMGetAttachment.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSDictionary *dict = (__bridge NSDictionary *)CMGetAttachment(sampleBuffer, kCGImagePropertyExifDictionary, NULL);
    // ... use the EXIF dictionary ...
}
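You can then read the unfiltered scene brightness the question asks about from that dictionary, using the same key as in the other answer:

    float brightness = [[dict objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue]; // EXIF BrightnessValue (APEX units)
    NSLog(@"brightness %f", brightness);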

Upvotes: 8
