Kovasandra

glReadPixels returns zeroes with multi-sampling

I am writing an OpenGL ES app for iOS, and I need to take an in-app screenshot of the rendered scene. Everything works fine when multi-sampling is off. But when I turn multi-sampling on, glReadPixels no longer returns the correct data (the scene itself is drawn correctly, and graphics quality is much better with multi-sampling).

I have already checked a bunch of similar questions on SO and elsewhere, but none of them solve my problem, since I am already doing things the proposed way:

  1. I take the screenshot after the buffers are resolved, but before the renderbuffer is presented.
  2. glReadPixels does not return an error.
  3. I even tried setting kEAGLDrawablePropertyRetainedBacking to YES and taking the screenshot after the buffer is presented; that does not work either.
  4. I target the OpenGL ES 1.x rendering API (the context is initialised with kEAGLRenderingAPIOpenGLES1).

Basically I am out of ideas about what could be wrong. Posting a question on SO is my last resort.

This is the relevant source code:

Creating the framebuffers

- (BOOL)createFramebuffer
{

    glGenFramebuffersOES(1, &viewFramebuffer);
    glGenRenderbuffersOES(1, &viewRenderbuffer);

    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer*)self.layer];
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);

    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

    // Multisample support

    glGenFramebuffersOES(1, &sampleFramebuffer);
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, sampleFramebuffer);

    glGenRenderbuffersOES(1, &sampleColorRenderbuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, sampleColorRenderbuffer);
    glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_RGBA8_OES, backingWidth, backingHeight);
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, sampleColorRenderbuffer);

    glGenRenderbuffersOES(1, &sampleDepthRenderbuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, sampleDepthRenderbuffer);
    glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, sampleDepthRenderbuffer);

    // End of multisample support

    GLenum status = glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES);
    if (status != GL_FRAMEBUFFER_COMPLETE_OES) {
        NSLog(@"failed to make complete framebuffer object %x", status);
        return NO;
    }

    return YES;
}

Resolving the buffers and taking the snapshot

    glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, viewFramebuffer);
    glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, sampleFramebuffer);
    glResolveMultisampleFramebufferAPPLE();
    [self checkGlError];

    //glFinish();

    if (capture)
        captureImage = [self snapshot:self];    

    const GLenum discards[]  = {GL_COLOR_ATTACHMENT0_OES,GL_DEPTH_ATTACHMENT_OES};
    glDiscardFramebufferEXT(GL_READ_FRAMEBUFFER_APPLE,2,discards);

    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);    

    [context presentRenderbuffer:GL_RENDERBUFFER_OES];    

Snapshot method (basically copied from the Apple docs)

- (UIImage*)snapshot:(UIView*)eaglview
{

    // Bind the color renderbuffer used to render the OpenGL ES view.
    // If your application only creates a single color renderbuffer which is
    // already bound at this point, this call is redundant, but it is needed
    // when you are dealing with multiple renderbuffers.
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);


    NSInteger x = 0, y = 0, width = backingWidth, height = backingHeight;
    NSInteger dataLength = width * height * 4;
    GLubyte *data = (GLubyte*)malloc(dataLength * sizeof(GLubyte));

    // Read pixel data from the framebuffer
    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    [self checkGlError];
    glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);
    [self checkGlError];

    // Create a CGImage with the pixel data
    // If your OpenGL ES content is opaque, use kCGImageAlphaNoneSkipLast to ignore the alpha channel
    // otherwise, use kCGImageAlphaPremultipliedLast
    CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorspace, kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
                                ref, NULL, true, kCGRenderingIntentDefault);

    // OpenGL ES measures data in PIXELS
    // Create a graphics context with the target size measured in POINTS
    NSInteger widthInPoints, heightInPoints;
    if (NULL != UIGraphicsBeginImageContextWithOptions) {
        // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration
        // Set the scale parameter to your OpenGL ES view's contentScaleFactor
        // so that you get a high-resolution snapshot when its value is greater than 1.0
        CGFloat scale = eaglview.contentScaleFactor;
        widthInPoints = width / scale;
        heightInPoints = height / scale;
        UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale);
    }
    else {
        // On iOS prior to 4, fall back to use UIGraphicsBeginImageContext
        widthInPoints = width;
        heightInPoints = height;
        UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints));
    }

    CGContextRef cgcontext = UIGraphicsGetCurrentContext();

    // UIKit coordinate system is upside down to GL/Quartz coordinate system
    // Flip the CGImage by rendering it to the flipped bitmap context
    // The size of the destination area is measured in POINTS
    CGContextSetBlendMode(cgcontext, kCGBlendModeCopy);
    CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref);

    // Retrieve the UIImage from the current context
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();

    UIGraphicsEndImageContext();

    // Clean up
    free(data);
    CFRelease(ref);
    CFRelease(colorspace);
    CGImageRelease(iref);

    return image;
}


Answers (1)

Christian Rau

You resolve the multisample buffers as usual by calling glResolveMultisampleFramebufferAPPLE after binding viewFramebuffer as the draw framebuffer and sampleFramebuffer as the read framebuffer. But did you also remember to bind viewFramebuffer as the read framebuffer (glBindFramebuffer(GL_READ_FRAMEBUFFER, viewFramebuffer)) before the glReadPixels call? glReadPixels always reads from the currently bound read framebuffer, and if you don't change this binding after the multisample resolve, it is still the multisample framebuffer and not the resolved one.
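Applied to the capture code in the question, that means one extra bind between the resolve and the snapshot call (a sketch; only the second GL_READ_FRAMEBUFFER_APPLE bind is new, everything else is from the question):

    glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, viewFramebuffer);
    glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, sampleFramebuffer);
    glResolveMultisampleFramebufferAPPLE();

    // glReadPixels reads from the READ framebuffer binding, so point it
    // at the resolved (single-sample) framebuffer before taking the snapshot
    glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, viewFramebuffer);

    if (capture)
        captureImage = [self snapshot:self];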

I also find your glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer) calls confusing, because they don't really do anything meaningful here: the currently bound renderbuffer is only relevant for functions that operate on renderbuffers (in practice only glRenderbufferStorage), though it may be that ES requires this binding for [context presentRenderbuffer:GL_RENDERBUFFER_OES] to work. Perhaps you assumed that this binding also controls which buffer glReadPixels reads from, but that is not the case; it always reads from the framebuffer currently bound to GL_READ_FRAMEBUFFER.
