Doug

Reputation: 35206

Why does this GLSL shader not respect the depth index?

This is a vertex shader I'm currently working with:

attribute vec3 v_pos;
attribute vec4 v_color;
attribute vec2 v_uv;
attribute vec3 v_rotation; // [angle, x, y]

uniform mat4 modelview_mat;
uniform mat4 projection_mat;

varying vec4 frag_color;
varying vec2 uv_vec;
varying mat4 v_rotationMatrix;

void main (void) {

    float cos = cos(v_rotation[0]);
    float sin = sin(v_rotation[0]);

    // Build a 2D rotation matrix from the per-vertex angle.
    mat2 trans_rotate = mat2(
      cos, -sin,
      sin, cos
    );

    vec2 rotated = trans_rotate * vec2(v_pos[0] - v_rotation[1], v_pos[1] - v_rotation[2]);
    gl_Position = projection_mat * modelview_mat * vec4(rotated[0] + v_rotation[1], rotated[1] + v_rotation[2], 1.0, 1.0);
    gl_Position[2] = 1.0 - v_pos[2] / 100.0; // Arbitrary maximum depth for this shader.
    frag_color = vec4(gl_Position[2], 0.0, 1.0, 1.0);  // <----------- !!
    uv_vec = v_uv;
}

and the fragment:

varying vec4 frag_color;
varying vec2 uv_vec;

uniform sampler2D tex;

void main (void){
    vec4 color = texture2D(tex, uv_vec) * frag_color;
    gl_FragColor = color;
}

Notice how I'm manually setting the Z index of the gl_Position variable to a bounded value in the range 0.0 -> 1.0 (the upper bound is enforced in code; safe to say, no vertex has a z value < 0 or > 100). With the formula above, z = 15 maps to 0.85 and z = 80 maps to 0.2.

It works... mostly. The problem is that when I render it, I get this:

[Screenshot: the three sprites rendered in the wrong depth order]

That's not the correct depth sorting for these elements, which have z values of 15, 50, and 80 respectively, as you can see from the red channel of each sprite.

The correct order would be blue on top, purple in the middle, and pink at the bottom; but instead these sprites are simply appearing in draw order.

i.e. they are being drawn via:

glDrawArrays() <--- Pink, first geometry batch
glDrawArrays() <--- Blue, second geometry batch
glDrawArrays() <--- Purple, third geometry batch

What's going on?

Surely it's irrelevant how many times I call GL draw functions before flushing; depth testing should sort this all out, right?

Do you have to manually invoke depth testing inside the fragment shader somehow?
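
For reference, this is the host-side depth state I'd expect to be sufficient (a sketch, not my exact code; the vertex counts are placeholders):

/* Host-side depth setup (sketch; assumes the context was created
 * with a depth buffer attached). */
glEnable(GL_DEPTH_TEST); /* depth testing is off by default */
glDepthFunc(GL_LESS);    /* keep fragments with smaller depth values */
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); /* clear depth every frame */

glDrawArrays(GL_TRIANGLES, 0, pink_count);   /* Pink, first geometry batch */
glDrawArrays(GL_TRIANGLES, 0, blue_count);   /* Blue, second geometry batch */
glDrawArrays(GL_TRIANGLES, 0, purple_count); /* Purple, third geometry batch */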

Upvotes: 0

Views: 1928

Answers (1)

Andon M. Coleman

Reputation: 43359

You say you're normalizing the output Z value into the range 0.0 to 1.0?

It should really be the range -W to +W. Given an orthographic projection (where W = 1.0), this means the clip-space Z should range from -1.0 to +1.0. You are only using half of your depth range, which significantly reduces the resolving capability of the depth buffer.
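
To make those ranges concrete, here is a minimal sketch (plain C, assuming the default glDepthRange of 0.0 to 1.0) of the fixed-function steps that run on gl_Position after your vertex shader:

#include <stdio.h>

/* Sketch of what happens to gl_Position.z after the vertex shader,
 * assuming the default glDepthRange(0.0, 1.0). */
int main (void) {
    float z_clip = 0.5f, w_clip = 1.0f; /* from gl_Position (w = 1.0 here) */
    float z_ndc  = z_clip / w_clip;     /* perspective divide -> [-1.0, +1.0] */

    float range_near = 0.0f, range_far = 1.0f; /* default depth range */
    float z_window = (range_far - range_near) * 0.5f * z_ndc
                   + (range_far + range_near) * 0.5f; /* -> [0.0, 1.0] */

    printf("clip %.2f -> NDC %.2f -> window %.2f\n", z_clip, z_ndc, z_window);
    return 0; /* prints: clip 0.50 -> NDC 0.50 -> window 0.75 */
}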

To make matters worse (and I am pretty sure this is where your actual problem comes from), it looks like you are inverting your depth buffer by giving the farthest points a value of 0.0 and the nearest points 1.0. In actuality, -1.0 corresponds to the near plane and +1.0 corresponds to the far plane in OpenGL.

gl_Position[2] = 1.0 - v_pos[2] / 100.0;
~~~~~~~~~~~~~~
// This needs to be changed; your depth is completely backwards.


gl_Position.z = 2.0 * (v_pos.z / 100.0) - 1.0;
~~~~~~~~~~~~~
// This should fix the direction and use the full depth range.

However, it is worth mentioning that the value of gl_Position[2] (or gl_Position.z) now ranges from -1.0 to 1.0, which means it cannot be used as a visible color without some scaling and biasing:

frag_color = vec4 (gl_Position.z * 0.5 + 0.5, 0.0, 1.0, 1.0);  // <----------- !!

On a final note, I have been discussing Normalized Device Coordinates this entire time, not window coordinates. In window coordinates the default depth range is 0.0 = near, 1.0 = far; this may have been the source of some confusion. Understand that window coordinates (gl_FragCoord) are not pertinent to the calculations in the vertex shader.

You can use this in your fragment shader to test if your depth range is setup correctly:

vec4 color = texture2D(tex, uv_vec) * vec4 (gl_FragCoord.z, frag_color.yzw);

It should produce the same results as:

vec4 color = texture2D(tex, uv_vec) * frag_color;

Upvotes: 1
