Reputation: 19814
I'm using AVFoundation to capture frames from the camera, passing each frame to a CIFilter that replaces darker pixels with transparent pixels (alpha of 0), and displaying the result in a GLKView.
The containing view controller is a GLKViewController, but the meat of the program starts with this method:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    @autoreleasepool {
        CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer
                                                 options:[NSDictionary dictionaryWithObject:(id)kCFNull
                                                                                     forKey:kCIImageColorSpace]];

        // Rotate the camera image upright and move its origin back to (0, 0).
        image = [image imageByApplyingTransform:CGAffineTransformMakeRotation(-M_PI / 2.0)];
        CGPoint origin = [image extent].origin;
        image = [image imageByApplyingTransform:CGAffineTransformMakeTranslation(-origin.x, -origin.y)];

        // Run the filter that turns dark pixels transparent.
        [self.replaceDarkColorWithTransparentFilter setValue:image forKey:@"inputImage"];
        image = self.replaceDarkColorWithTransparentFilter.outputImage;

        // Draw the filtered image into the render buffer and present it.
        [self.ciContext drawImage:image inRect:CGRectMake(0.0f, 0.0f, 480.0f, 640.0f) fromRect:[image extent]];
        [self.eaglContext presentRenderbuffer:GL_RENDERBUFFER];

        [self.replaceDarkColorWithTransparentFilter setValue:nil forKey:@"inputImage"];
    }
}
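For context, replaceDarkColorWithTransparentFilter could be something along the lines of a CIColorCube that zeroes out the alpha of dark colors. A rough sketch of that approach (the cube size and the 0.2 brightness threshold here are arbitrary choices, not my exact filter):

- (CIFilter *)makeReplaceDarkColorWithTransparentFilter {
    const unsigned int size = 64;
    size_t cubeDataSize = size * size * size * 4 * sizeof(float);
    float *cubeData = (float *)malloc(cubeDataSize);
    float *c = cubeData;
    // CIColorCube data is blue-major, premultiplied RGBA floats.
    for (unsigned int b = 0; b < size; b++) {
        for (unsigned int g = 0; g < size; g++) {
            for (unsigned int r = 0; r < size; r++) {
                float rf = (float)r / (size - 1);
                float gf = (float)g / (size - 1);
                float bf = (float)b / (size - 1);
                // Pixels below the brightness threshold become fully transparent.
                float alpha = ((rf + gf + bf) / 3.0f < 0.2f) ? 0.0f : 1.0f;
                *c++ = rf * alpha;
                *c++ = gf * alpha;
                *c++ = bf * alpha;
                *c++ = alpha;
            }
        }
    }
    NSData *data = [NSData dataWithBytesNoCopy:cubeData length:cubeDataSize freeWhenDone:YES];
    CIFilter *filter = [CIFilter filterWithName:@"CIColorCube"];
    [filter setValue:@(size) forKey:@"inputCubeDimension"];
    [filter setValue:data forKey:@"inputCubeData"];
    return filter;
}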
In the GLKViewController, I initialize the render buffer using:
glGenRenderbuffers(1, &_renderBuffer);
glBindRenderbuffer(GL_RENDERBUFFER, _renderBuffer);
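The rest of the setup roughly backs that render buffer with the view's drawable layer and attaches it to a framebuffer, along these lines (a sketch; the eaglContext property and the view-layer cast are assumptions rather than my exact code):

glGenRenderbuffers(1, &_renderBuffer);
glBindRenderbuffer(GL_RENDERBUFFER, _renderBuffer);
// Back the render buffer with the view's CAEAGLLayer so presentRenderbuffer: has something to show.
[self.eaglContext renderbufferStorage:GL_RENDERBUFFER
                         fromDrawable:(CAEAGLLayer *)self.view.layer];

GLuint frameBuffer;
glGenFramebuffers(1, &frameBuffer);
glBindFramebuffer(GL_FRAMEBUFFER, frameBuffer);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, _renderBuffer);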
The problem is that older frames are never cleared and show through the newer ones. It looks like this: http://d.pr/i/M02z Is this normal? Shouldn't the buffer be cleared so each frame starts fresh?
I've tried calling glFlush() in update and in glkView:(GLKView *)view drawInRect:(CGRect)rect, and I've also tried drawing a nil image in the CIContext, all to no avail.
Is there something I'm missing?
Upvotes: 2
Views: 2231
Reputation: 170319
You're going to need to clear out the previous contents of your framebuffer using code like
glClearColor(0.0, 0.0, 0.0, 0.0);
glClear(GL_COLOR_BUFFER_BIT);
at some point before you draw the new Core Image frame. This will set the contents and alpha channel of the framebuffer to 0.0 to give you a clean slate on which to draw your filtered image.
You might also need to set the current OpenGL ES context using something like
[EAGLContext setCurrentContext:self.openGLESContext];
right before that, because I can't remember if Core Image will leave that set properly for whatever thread you're working on.
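Putting both pieces into the capture callback would look roughly like this (the ciContext, eaglContext, and _renderBuffer names come from the question; the ordering is a sketch, not a guaranteed recipe):

[EAGLContext setCurrentContext:self.eaglContext];

glBindRenderbuffer(GL_RENDERBUFFER, _renderBuffer);
glClearColor(0.0, 0.0, 0.0, 0.0);   // transparent black
glClear(GL_COLOR_BUFFER_BIT);       // wipe the previous frame before drawing

[self.ciContext drawImage:image inRect:CGRectMake(0.0f, 0.0f, 480.0f, 640.0f) fromRect:[image extent]];
[self.eaglContext presentRenderbuffer:GL_RENDERBUFFER];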
Upvotes: 1