Reputation: 1873
I use such code to setup my framebuffer:
glGenRenderbuffers(1, &colorBuffer_);
glBindRenderbuffer(GL_RENDERBUFFER, colorBuffer_);
if (!colorBuffer_)
{
    NSLog(@"glGenRenderbuffers() failed");
    break;
}

[self.context renderbufferStorage:GL_RENDERBUFFER fromDrawable:drawable_];
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &width_);
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &height_);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
if (!fbo)
{
    NSLog(@"glGenFramebuffers() failed");
    break;
}
CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, self.context, NULL, &textureCache_);
if (err)
{
    NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err);
}
CFDictionaryRef empty; // empty IOSurface properties dictionary, used as the attribute value
CFMutableDictionaryRef attrs;
empty = CFDictionaryCreate(kCFAllocatorDefault,
                           NULL, NULL, 0,
                           &kCFTypeDictionaryKeyCallBacks,
                           &kCFTypeDictionaryValueCallBacks);
attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                  1,
                                  &kCFTypeDictionaryKeyCallBacks,
                                  &kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(attrs, kCVPixelBufferIOSurfacePropertiesKey, empty);
CVPixelBufferCreate(kCFAllocatorDefault,
                    (int)width_,
                    (int)height_,
                    kCVPixelFormatType_32BGRA,
                    attrs,
                    &renderTarget_);
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   textureCache_,
                                                   renderTarget_,
                                                   NULL, // texture attributes
                                                   GL_TEXTURE_2D,
                                                   GL_RGBA, // opengl format
                                                   (int)width_,
                                                   (int)height_,
                                                   GL_BGRA, // native iOS format
                                                   GL_UNSIGNED_BYTE,
                                                   0,
                                                   &renderTexture_);
if (err)
{
    NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
}
CFRelease(attrs);
CFRelease(empty);
glBindTexture(CVOpenGLESTextureGetTarget(renderTexture_), CVOpenGLESTextureGetName(renderTexture_));
checkForErrors();
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
NSLog(@"%u", CVOpenGLESTextureGetName(renderTexture_));
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture_), 0);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorBuffer_);
I want to render to a texture and to the renderbuffer (to see the results of rendering on the screen) at the same time, but this code is not working. I think that I can't use glFramebufferTexture2D and glFramebufferRenderbuffer at the same time. Am I right? How can I do it?
Upvotes: 1
Views: 1434
Reputation: 45968
You are right in that you cannot attach both a texture and a renderbuffer to the same attachment point to automatically render into both; an attachment point holds a single image, so your second attach call simply replaces the texture with the renderbuffer at GL_COLOR_ATTACHMENT0.
Just render to the texture and then draw a screen-sized textured quad to the screen to show it. And of course, remember to unbind your texture (glBindTexture(GL_TEXTURE_2D, 0)) while rendering into it, which at the moment you don't.
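For illustration, a rough sketch of that flow using the names from your code; screenFbo (a second framebuffer with colorBuffer_ attached as its color attachment) and the quad drawing are assumptions, not part of your original setup:

// Pass 1: render the scene into the texture-backed FBO.
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glBindTexture(GL_TEXTURE_2D, 0); // the render target must not also be bound as a source texture
glViewport(0, 0, width_, height_);
// ... draw your scene here ...

// Pass 2: draw a screen-sized quad that samples the texture.
glBindFramebuffer(GL_FRAMEBUFFER, screenFbo); // hypothetical FBO backed by colorBuffer_
glBindTexture(GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture_));
// ... draw the fullscreen quad here ...
glBindRenderbuffer(GL_RENDERBUFFER, colorBuffer_);
[self.context presentRenderbuffer:GL_RENDERBUFFER];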
Or do it the other way around: render the results to the screen as usual and copy them into the texture using glCopyTexSubImage2D. But either way you won't get around a copy, be it an indirect one in the form of drawing a textured quad or a direct framebuffer-to-texture copy.
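A sketch of that variant, again assuming the hypothetical screenFbo from above; glCopyTexSubImage2D reads from the currently bound framebuffer:

// Render to the screen framebuffer as usual.
glBindFramebuffer(GL_FRAMEBUFFER, screenFbo);
// ... draw your scene here ...

// Copy the framebuffer contents into the texture before presenting.
glBindTexture(GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture_));
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, width_, height_);

glBindRenderbuffer(GL_RENDERBUFFER, colorBuffer_);
[self.context presentRenderbuffer:GL_RENDERBUFFER];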
EDIT: You might also solve this using multiple render targets, by attaching the texture and the renderbuffer to different color attachments and writing the same resulting color to multiple outputs in the fragment shader (using gl_FragData[i] instead of gl_FragColor). But I'm not sure that would really buy you anything, plus it requires your shaders to be aware of the double rendering. And I'm not sure ES actually supports multiple render targets; on ES 2.0 they are only available through extensions such as GL_EXT_draw_buffers.
Upvotes: 4