Reputation: 41
I've been working on this issue for a while and it's time to ask the greater community for help. I have read many other StackOverflow questions on this topic and have not yet found a relevant solution.
I have a well-established Android OpenGL project that renders to a texture and then renders that texture to the screen. This mechanism is fundamental to my application, and I have a lot of history and confidence in it. I recently added new functionality to internally take a screenshot of the rendering; that is, my application can also save the rendered texture to a file. These images have traditionally been exactly the size of the display.
Now, I want to generate images larger than the screen size, so that the generated screenshots reflect the larger image size while still being scaled down to the screen size when displayed on screen. This should be a straightforward process; however, I am getting unexpected results. The resulting screenshot is the correct size, but is empty except for an area the size of the screen. For example, if the rendered texture and resulting screenshot are intended to be 4 times the screen display size (twice the screen size in each of X and Y), the screenshot image file will be that intended size, but only the upper-left quadrant of the image will have been drawn. Here is the resulting screenshot for this example: my viewport is 768x887, the resulting screenshot is correctly 1536x1774, and the only colored area within the screenshot is 768x887. For our purposes here, my fragment shader for rendering to texture is a test of the coordinate mapping to the screen...
gl_FragColor = vec4(uv.x, 0.0, uv.y, 1.0); // during render to texture
Note that when we draw this same texture to the screen during execution, the full screen is colored consistent with that shader. Why is only one quadrant of the screenshot filled, instead of the whole thing? And why, when this texture is drawn on screen, does it display only the part that's the size of the screen, rather than the whole thing with the three empty quadrants?
I get the original size of the viewport from GLSurfaceView.Renderer.onSurfaceChanged() and store it in _viewportWidth and _viewportHeight. When I create the frame buffer texture, I traditionally created it directly at _viewportWidth by _viewportHeight. Now, I have, as an example...
float quality = 2f;
_frameBufferWidth = (int)((float)_viewportWidth * quality);
_frameBufferHeight = (int)((float)_viewportHeight * quality);
... and generate the frame buffer at _frameBufferWidth by _frameBufferHeight.
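In sketch form, the creation path looks something like this (standard GLES20 calls; _frameBufferId and _renderTextureId are placeholder field names, not my exact code, and my real code also does the error handling and glCheckFramebufferStatus() check):
int[] ids = new int[1];

GLES20.glGenFramebuffers(1, ids, 0);
_frameBufferId = ids[0];

GLES20.glGenTextures(1, ids, 0);
_renderTextureId = ids[0];

// Allocate the color texture at the scaled size, not the viewport size.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _renderTextureId);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
        _frameBufferWidth, _frameBufferHeight, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

// Attach the texture as the FBO's color buffer.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, _frameBufferId);
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, _renderTextureId, 0);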
I am also calling glViewport() twice. After my first call to glBindFramebuffer() to render to the texture and not the screen, and after doing the relevant error handling, I call glViewport(0, 0, _frameBufferWidth, _frameBufferHeight), which passes without error. When I later want to draw this texture to the screen, I make my second glBindFramebuffer() call and, immediately after, call glViewport(0, 0, _viewportWidth, _viewportHeight). The idea is that the original render to texture goes into a _frameBufferWidth by _frameBufferHeight image, and when we present it on screen, we want a _viewportWidth by _viewportHeight size.
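So the per-frame flow, in sketch form (placeholder names as above; framebuffer 0 is the window surface, and drawScene()/drawFullScreenQuad() stand in for my actual draw calls):
// Pass 1: render the scene into the offscreen texture at the scaled size.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, _frameBufferId);
GLES20.glViewport(0, 0, _frameBufferWidth, _frameBufferHeight);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
drawScene(); // placeholder for my render-to-texture pass

// Pass 2: draw that texture to the screen at the display size.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
GLES20.glViewport(0, 0, _viewportWidth, _viewportHeight);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _renderTextureId);
drawFullScreenQuad(); // placeholder for my render-to-screen pass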
Any ideas what I may be missing? Thanks in advance.
EDIT (March 10, 2016):
I just tried quality=0.5f and am getting unusual results. I would prefer to share more images to clarify this scenario, but I'm a new member and am only allowed two. When we draw to the screen with quality=0.5f, the screen is colored properly according to the GLSL code above: the display is identical to the 768x887 upper-left quadrant of the screenshot linked above (corresponding to quality=2f). The quality=0.5f screenshot that is generated, however, is colored differently from the screen. This screenshot correctly has the intended 384x443 size, but it is still being rendered as though it were 768x887, with just a 384x443 part cropped out.
Even though the code suggests otherwise, it seems as though we're always rendering to a _viewportWidth by _viewportHeight area, rather than the intended _frameBufferWidth by _frameBufferHeight area.
I have basically a full-screen quad for both rendering passes and am used to that working fine. When I render to the screen, I sample the texture I just rendered to:
gl_FragColor = texture2D(u_sampler, uv); // during render to screen
The u_sampler accesses the texture we rendered to, and uv is in [0,1] for both dimensions. So, for the screen to show anything, it must be doing a texture lookup to get its color information. Thus, the bright red and blue shown on the screen must exist in the framebuffer originally, even though they are missing from the correctly sized screenshot.
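For context, the screenshot capture boils down to reading the offscreen framebuffer back; in simplified sketch form (the Bitmap handling here is illustrative rather than my exact code, and it needs java.nio.ByteBuffer, java.nio.ByteOrder, and android.graphics.Bitmap):
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, _frameBufferId);
ByteBuffer pixels = ByteBuffer.allocateDirect(_frameBufferWidth * _frameBufferHeight * 4)
        .order(ByteOrder.nativeOrder());
GLES20.glReadPixels(0, 0, _frameBufferWidth, _frameBufferHeight,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels);

// GL rows are bottom-up, so the image may need a vertical flip before saving.
Bitmap screenshot = Bitmap.createBitmap(_frameBufferWidth, _frameBufferHeight,
        Bitmap.Config.ARGB_8888);
screenshot.copyPixelsFromBuffer(pixels);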
Upvotes: 4
Views: 4127
Reputation: 167
I ran into the same issue before, on iOS, also with OpenGL ES. I tried to render to a 4096x4096 framebuffer and the rendering ended up in the top-left corner.
The solution I found is to call
glViewport(0, 0, 4096, 4096)
before any other calls such as
glClearColor(0, 0, 0, 1);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
in the main render function.
I was then able to render to the screen with
glViewport(0, 0, view.bounds.width*3, view.bounds.height*3);
since glViewport maps normalized device coordinates to pixel coordinates.
Another thing to mention: on iOS, a view's size is in points, not pixels, so I need to multiply it by 3 (my device's screen scale). You should check whether you get actual pixel coordinates on Android.
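In your Android/Java terms, the ordering I mean is roughly this (a sketch using the field names from your question, not your exact code):
// Set the viewport to the FBO's full pixel size immediately after binding it,
// and before clearing or drawing; otherwise the previous screen-sized viewport
// is still in effect.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, _frameBufferId);
GLES20.glViewport(0, 0, _frameBufferWidth, _frameBufferHeight);
GLES20.glClearColor(0f, 0f, 0f, 1f);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
// ... draw the scene here ...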
Upvotes: 2