Amitg2k12

Reputation: 3805

Capture Camera Buffer in Mac/Cocoa

In my application, I need to capture the image buffer from the camera and send it to the other end over the network.

I have used the following code:

-(void)startVideoSessionInSubThread{
    // Create the capture session

    pPool = [[NSAutoreleasePool alloc] init];

    mCaptureSession = [[QTCaptureSession alloc] init] ;

    // Connect inputs and outputs to the session    
    BOOL success = NO;
    NSError *error = nil;

    // Find a video device  

    QTCaptureDevice *videoDevice = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    success = [videoDevice open:&error];


    // If a video input device can't be found or opened, try to find and open a muxed input device

    if (!success) {
        videoDevice = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeMuxed];
        success = [videoDevice open:&error];

    }

    if (!success) {
        videoDevice = nil;
        // Handle error


    }

    if (videoDevice) {
        //Add the video device to the session as a device input

        mCaptureVideoDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:videoDevice];
        success = [mCaptureSession addInput:mCaptureVideoDeviceInput error:&error];
        if (!success) {
            // Handle error
        }


        mCaptureDecompressedVideoOutput = [[QTCaptureDecompressedVideoOutput alloc] init];

        [mCaptureDecompressedVideoOutput setPixelBufferAttributes:[NSDictionary dictionaryWithObjectsAndKeys:
                                                                   [NSNumber numberWithDouble:320.0], (id)kCVPixelBufferWidthKey,
                                                                   [NSNumber numberWithDouble:240.0], (id)kCVPixelBufferHeightKey,
                                                                   [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
                                                                   nil]];

        [mCaptureDecompressedVideoOutput setDelegate:self];

        [mCaptureDecompressedVideoOutput setMinimumVideoFrameInterval:(1.0 / 30.0)]; // cap at ~30 fps for a smooth video effect

        success = [mCaptureSession addOutput:mCaptureDecompressedVideoOutput error:&error];

        if (!success) {
            [[NSAlert alertWithError:error] runModal];
            return;
        }

        [mCaptureView setCaptureSession:mCaptureSession];
        bVideoStart = NO;
        [mCaptureSession startRunning];
        bVideoStart = YES;

    }

    [pPool drain];
}
-(void)startVideoSession{
    // start video from different session 
    [NSThread detachNewThreadSelector:@selector(startVideoSessionInSubThread) toTarget:self withObject:nil];
}

In the callback function:

// Do something with the buffer 
- (void)captureOutput:(QTCaptureOutput *)captureOutput didOutputVideoFrame:(CVImageBufferRef)videoFrame 
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer 
       fromConnection:(QTCaptureConnection *)connection
{
    // Do something with the buffer
    [self processImageBufferNew:videoFrame];
}

In the function processImageBufferNew, I add the image to a synchronized queue. A separate thread reads from the queue and processes each buffer.

What is happening is that, according to the log, the capture callback fires very frequently, so sending a frame is slow relative to capture and the queue size grows very rapidly.

Any suggestions on the design?

I run the network thread separately; it queries the queue for the oldest node so frames are sent sequentially. From the log it appears that more than 500 nodes are added per minute, which causes growing memory use and CPU starvation.

Is there any other approach I should use to capture the camera frames?

Upvotes: 1

Views: 1286

Answers (1)

Michael Dautermann

Reputation: 89509

If you can't send frames over the network as fast as they arrive in QTCaptureDecompressedVideoOutput's captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection: delegate method, you're going to have to start dropping frames at some point (when you run out of memory, when you run out of space in a fixed-size array of frames to send, etc.).

I'd recommend choosing some kind of network packet transmission algorithm where dropping frames isn't so obvious or abrupt. Faster network throughput means fewer frames to drop; slower networks mean more frames have to go unsent.

Upvotes: 1
