Reputation: 21057
I want to display an image fullscreen in my OpenGL app without losing its aspect ratio. I know that I can draw an image as a texture onto a "cube" or a 2d plane. But I'm not sure if this is really the best way when I simply want to show a 2d image.
Especially because I want this image to be fullscreen without losing its aspect ratio. I know this is easy with an ImageView, but I need this in my OpenGL ES application.
But I have no clue how to do this. Anyone any idea?
Upvotes: 2
Views: 3028
Reputation: 54632
Since the aspect ratio of your image and the aspect ratio of your view will generally be different, there are two cases. With aspect ratio defined as width/height:
1. The image's aspect ratio is larger than the view's: the image covers the full width of the view, with empty bands at the top and bottom.
2. The image's aspect ratio is smaller than the view's: the image covers the full height of the view, with empty bands on the left and right.
Without any transformations applied, OpenGL uses a coordinate system that has a range of [-1.0, 1.0] in both the x- and y-direction. So if you draw a quad covering [-1.0, 1.0] in both directions, it will fill the entire view. Since we do not want to fill the entire range in one of the two directions, we need to scale down the y-coordinates in case 1, and the x-coordinates in case 2. The amount of scaling needed corresponds to the ratio between the two aspect ratios.
With the following values:
float imgAspectRatio = (float)imgWidth / (float)imgHeight;
float viewAspectRatio = (float)viewWidth / (float)viewHeight;
We calculate the scaling factors for x and y as:
float xScale = 1.0f;
float yScale = 1.0f;
if (imgAspectRatio > viewAspectRatio) {
    yScale = viewAspectRatio / imgAspectRatio;
} else {
    xScale = imgAspectRatio / viewAspectRatio;
}
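As a quick sanity check with made-up numbers: a 1024x768 image (aspect ratio ≈ 1.33) shown in a 480x800 portrait view (aspect ratio 0.6) falls into case 1, so the image fills the full width and gets letterboxed vertically:
xScale = 1.0f;
yScale = 0.6f / 1.33f;   // ≈ 0.45, so the quad will span [-1.0, 1.0] x [-0.45, 0.45]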
Now all that's left is to apply these scale factors while rendering. There are multiple ways of doing that. You could use them for your input coordinates, and draw a quad that spans the range [-xScale, xScale] in the x-direction and [-yScale, yScale] in the y-direction. Or you can apply the scaling in your shader, which I think is slightly more elegant. In this case, you still draw a quad with extent [-1.0, 1.0] in both directions, and use a vertex shader that could look like this:
#version 100
uniform vec2 ScaleFact;
attribute vec2 Position;
varying vec2 TexCoord;
void main() {
    gl_Position = vec4(ScaleFact * Position, 0.0, 1.0);
    TexCoord = 0.5 * Position + 0.5;
}
And a fragment shader that simply samples the texture:
#version 100
precision mediump float;
uniform sampler2D Tex;
varying vec2 TexCoord;
void main() {
    gl_FragColor = texture2D(Tex, TexCoord);
}
You pass in the xScale/yScale values we calculated above as the value of the ScaleFact uniform, set up the attribute location and everything else the way you normally would, and then render the [-1.0, 1.0] x [-1.0, 1.0] quad.
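Since the question mentions Android, here is a rough idea of what that could look like with the Java GLES20 bindings. This is only a sketch under the assumption that you already have a compiled/linked program and an uploaded texture; program, textureId, xScale and yScale are placeholders for whatever your renderer already holds (imports: android.opengl.GLES20 and java.nio.*):
// Fullscreen quad in normalized device coordinates, drawn as a triangle strip.
float[] quad = { -1f, -1f,   1f, -1f,   -1f, 1f,   1f, 1f };
FloatBuffer quadBuffer = ByteBuffer.allocateDirect(quad.length * 4)
        .order(ByteOrder.nativeOrder()).asFloatBuffer();
quadBuffer.put(quad).position(0);

GLES20.glUseProgram(program);

// Pass the scale factors calculated above to the ScaleFact uniform.
GLES20.glUniform2f(GLES20.glGetUniformLocation(program, "ScaleFact"), xScale, yScale);

// Bind the image texture to unit 0 and point the Tex sampler at it.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
GLES20.glUniform1i(GLES20.glGetUniformLocation(program, "Tex"), 0);

// Feed the quad vertices to the Position attribute and draw.
int positionLoc = GLES20.glGetAttribLocation(program, "Position");
GLES20.glEnableVertexAttribArray(positionLoc);
GLES20.glVertexAttribPointer(positionLoc, 2, GLES20.GL_FLOAT, false, 0, quadBuffer);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);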
The advantage of doing the scaling in the shader is that you can still easily derive the position and texture coordinates from a single input attribute. If you pass in the scaled positions, you would probably want a separate attribute to pass in the texture coordinates, which is a perfectly fine thing to do as well; a rough sketch of that variant is below.
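If you prefer that variant instead, the vertex data could look something like this (again just a sketch; the vertex shader would then read the texture coordinates from a second attribute rather than deriving them from Position):
// Positions pre-scaled on the CPU, texture coordinates supplied separately.
float[] positions = { -xScale, -yScale,   xScale, -yScale,   -xScale, yScale,   xScale, yScale };
float[] texCoords  = {  0f, 0f,   1f, 0f,   0f, 1f,   1f, 1f };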
Upvotes: 6