Reputation: 3975
I have a Cocoa app that is intended to capture still images from a USB microscope and then do some post-processing on them before saving them to an image file. At the moment, I am stuck trying to get from the CMSampleBufferRef that's passed to my completionHandler block to an NSImage or some other representation I can manipulate and save using familiar Cocoa APIs.
I found the function imageFromSampleBuffer() in the AVFoundation docs, which purports to convert a CMSampleBuffer to a UIImage (sigh), and revised it appropriately to return an NSImage. But it does not work in this case, as the call to CMSampleBufferGetImageBuffer() returns nil.
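For context, my NSImage revision of that function is along these lines (a sketch, not the exact code; the method name and structure follow Apple's sample). It never gets past the first call, because CMSampleBufferGetImageBuffer() returns nil for this buffer:

```objective-c
// NSImage port of Apple's imageFromSampleBuffer() sample (sketch).
// Assumes the sample buffer holds an uncompressed CVPixelBuffer,
// which is exactly what fails here: for a 'jpeg' buffer,
// CMSampleBufferGetImageBuffer() returns NULL.
- (NSImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (imageBuffer == NULL)
        return nil;  // <-- this is where it dies in my case

    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    void *baseAddress  = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width       = CVPixelBufferGetWidth(imageBuffer);
    size_t height      = CVPixelBufferGetHeight(imageBuffer);

    // Render the BGRA pixel data into a CGImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little |
                                                 kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    NSImage *image = [[NSImage alloc] initWithCGImage:cgImage
                                                 size:NSMakeSize(width, height)];
    CGImageRelease(cgImage);
    return image;
}
```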
Here is a log showing the CMSampleBuffer passed to my completion block:
2012-01-21 19:38:36.293 LabCam[1402:cb0f] CMSampleBuffer 0x100335390 retainCount: 1 allocator: 0x7fff8c78620c
invalid = NO
dataReady = YES
makeDataReadyCallback = 0x0
makeDataReadyRefcon = 0x0
buffer-level attachments:
com.apple.cmio.buffer_attachment.discontinuity_flags(P) = 0
com.apple.cmio.buffer_attachment.hosttime(P) = 79631546824089
com.apple.cmio.buffer_attachment.sequence_number(P) = 42
formatDescription = <CMVideoFormatDescription 0x100335220 [0x7fff782fff40]> {
mediaType:'vide'
mediaSubType:'jpeg'
mediaSpecific: {
codecType: 'jpeg' dimensions: 640 x 480
}
extensions: {<CFBasicHash 0x100335160 [0x7fff782fff40]>{type = immutable dict, count = 5,
entries =>
1 : <CFString 0x7fff773dff48 [0x7fff782fff40]>{contents = "Version"} = <CFNumber 0x183 [0x7fff782fff40]>{value = +1, type = kCFNumberSInt32Type}
2 : <CFString 0x7fff773dff68 [0x7fff782fff40]>{contents = "RevisionLevel"} = <CFNumber 0x183 [0x7fff782fff40]>{value = +1, type = kCFNumberSInt32Type}
3 : <CFString 0x7fff7781ab08 [0x7fff782fff40]>{contents = "CVFieldCount"} = <CFNumber 0x183 [0x7fff782fff40]>{value = +1, type = kCFNumberSInt32Type}
4 : <CFString 0x7fff773dfdc8 [0x7fff782fff40]>{contents = "FormatName"} = <CFString 0x7fff76d35fb0 [0x7fff782fff40]>{contents = "Photo - JPEG"}
5 : <CFString 0x7fff773dff88 [0x7fff782fff40]>{contents = "Vendor"} = <CFString 0x7fff773dffa8 [0x7fff782fff40]>{contents = "appl"}
}
}
}
sbufToTrackReadiness = 0x0
numSamples = 1
sampleTimingArray[1] = {
{PTS = {2388943236/30000 = 79631.441, rounded}, DTS = {INVALID}, duration = {3698/30000 = 0.123}},
}
sampleSizeArray[1] = {
sampleSize = 55911,
}
dataBuffer = 0x100335300
It clearly appears to contain JPEG data, but how do I get at it? (Preferably keeping the associated metadata along for the ride…)
Upvotes: 3
Views: 2490
Reputation: 3975
I eventually solved this with help from another code example. CMSampleBufferGetImageBuffer only returns a valid result for the uncompressed, native image formats available from the camera. So to get my program to work, I had to configure the AVCaptureStillImageOutput instance to use k32BGRAPixelFormat instead of its default (JPEG) compressed format.
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;
imageOutput = [[AVCaptureStillImageOutput alloc] init];
// Configure imageOutput for BGRA pixel format [#2].
NSNumber * pixelFormat = [NSNumber numberWithInt:k32BGRAPixelFormat];
[imageOutput setOutputSettings:[NSDictionary dictionaryWithObject:pixelFormat
forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
[session addOutput:imageOutput];
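With the output configured this way, CMSampleBufferGetImageBuffer returns a valid CVPixelBuffer in the completion handler, which can be wrapped in a CIImage and then an NSImage. A sketch (assuming `connection` is the output's video AVCaptureConnection; error handling omitted):

```objective-c
[imageOutput captureStillImageAsynchronouslyFromConnection:connection
                                         completionHandler:
    ^(CMSampleBufferRef sampleBuffer, NSError *error)
{
    // With k32BGRAPixelFormat this is no longer nil.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Wrap the pixel buffer in a CIImage, then build an NSImage from it.
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:pixelBuffer];
    NSCIImageRep *rep = [NSCIImageRep imageRepWithCIImage:ciImage];
    NSImage *image = [[NSImage alloc] initWithSize:rep.size];
    [image addRepresentation:rep];

    // ... post-process and save image using familiar Cocoa APIs ...
}];
```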
Upvotes: 8