Reputation: 12749
I have a video app that does both a live preview as well as a still image capture while the preview is running. I am using 4 textures that are pre-generated when the app loads. I am accessing the textures across three threads.
In order to make the live preview work, I had to use a sharegroup (see below) so that the captureOutput method could store a result in a framebuffer called FBO_OUT. Then, in order to display to the screen, I needed to access FBO_OUT for the call to presentRenderbuffer. If I didn't use the sharegroup, I just got a bunch of gibberish.
// Layer setup for the on-screen view
CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
eaglLayer.opaque = YES;
eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
                                [NSNumber numberWithBool:NO], kEAGLDrawablePropertyRetainedBacking,
                                kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat,
                                nil];

// Main context for drawing to screen, plus a second context in the same
// sharegroup for use on the capture threads
oglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
offscreenContext = [[EAGLContext alloc] initWithAPI:[oglContext API] sharegroup:oglContext.sharegroup];

if (!oglContext || ![EAGLContext setCurrentContext:oglContext]) {
    NSLog(@"Problem with OpenGL context.");
    [self release];
    return nil;
}
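For reference, the creation of FBO_OUT isn't shown in the question. Below is a minimal sketch of how such a framebuffer might be set up, assuming it renders into a texture; the names and the 640x480 size are placeholders, not the original code. Note that an EAGLSharegroup shares textures, buffers, and renderbuffers between contexts, while framebuffer objects themselves are per-context, so what both contexts actually see is FBO_OUT's texture attachment.

// Sketch only: a framebuffer like FBO_OUT rendering into a texture.
GLuint FBO_OUT = 0, outputTexture = 0;

glGenTextures(1, &outputTexture);
glBindTexture(GL_TEXTURE_2D, outputTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 640, 480, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

glGenFramebuffers(1, &FBO_OUT);
glBindFramebuffer(GL_FRAMEBUFFER, FBO_OUT);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, outputTexture, 0);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    NSLog(@"FBO_OUT is incomplete");
}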
Periodically, inside captureOutput, I need to call this code:
#define SHAREGROUP_CONTEXT [[[appDelegate mainViewController] oglView] offscreenContext]

// Make sure this capture thread is using the sharegroup'd offscreen context
if ([EAGLContext currentContext] != SHAREGROUP_CONTEXT) {
    NSLog(@"setting context");
    glFlush();
    [EAGLContext setCurrentContext:SHAREGROUP_CONTEXT];
}

@synchronized(SHAREGROUP_CONTEXT)
{
    /* process pixels */
}

glFlush(); // at end of method
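For concreteness, here is a rough sketch of what that delegate method could look like with the /* process pixels */ part filled in as a plain glTexImage2D upload of the camera frame. The videoTexture name (one of the four pre-generated textures) and the BGRA capture format are assumptions for illustration, not details from the question.

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Bind this capture thread to the sharegroup'd offscreen context
    if ([EAGLContext currentContext] != SHAREGROUP_CONTEXT) {
        glFlush();
        [EAGLContext setCurrentContext:SHAREGROUP_CONTEXT];
    }

    @synchronized(SHAREGROUP_CONTEXT)
    {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        // Upload the camera frame into one of the pre-generated textures
        glBindTexture(GL_TEXTURE_2D, videoTexture);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                     (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
                     (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
                     0, GL_BGRA, GL_UNSIGNED_BYTE,
                     CVPixelBufferGetBaseAddress(pixelBuffer));

        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

        // ...then bind FBO_OUT and run the shader pass that produces the preview...
    }

    glFlush(); // so the other context sees the finished work
}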
The live preview path works fine. The problem is that I'm now trying to do the same thing after taking a still image (while the preview is still running) in the captureStillImageAsynchronouslyFromConnection completion block; however, I am getting gibberish again even though I've tried doing this:
AVCaptureConnection *sic = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo
                                                   fromConnections:[[self stillImageOutput] connections]];

[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:sic
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
{
    @synchronized(SHAREGROUP_CONTEXT)
    {
        /* generate new textures to process the imageDataSampleBuffer and cry */
    }
}];
This seems to be a problem with contexts and threading.
Upvotes: 1
Views: 303
Reputation: 12749
I've found the answer. You need to make sure the @synchronized block is in both methods. This took me so long to find because something very strange was happening with my shaders. One of the filenames was lowercase on disk when Xcode thought they were all uppercase. This made the shader work in the live preview but gave me a white screen in the still image capture portion. No idea why.
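In other words, the still-image completion handler ends up guarded the same way as captureOutput. The following is only a sketch of that pattern: the context check mirrors the live-preview code in the question, and the early return on error is my addition.

[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:sic
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
{
    if (error || !imageDataSampleBuffer) {
        return;
    }

    // Same guard as in captureOutput: hop onto the sharegroup'd context...
    if ([EAGLContext currentContext] != SHAREGROUP_CONTEXT) {
        glFlush();
        [EAGLContext setCurrentContext:SHAREGROUP_CONTEXT];
    }

    // ...and serialize against the live preview so only one thread touches
    // the shared GL objects at a time.
    @synchronized(SHAREGROUP_CONTEXT)
    {
        /* process imageDataSampleBuffer with the same code path as the preview */
    }

    glFlush();
}];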
Upvotes: 1