Nicolas Lefebvre

Reputation: 4282

simple procedural skybox

As part of an attempt to generate a very simple-looking sky, I've created a skybox (basically a cube going from (-1, -1, -1) to (1, 1, 1)), which is drawn after all of my geometry and forced to the back via the following simple vertex shader:

#version 330
layout(location = 0) in vec4 position;
layout(location = 1) in vec4 normal;    // unused in this shader

out Data
{
    vec4 eyespace_position;
    vec4 eyespace_normal;       // never written here
    vec4 worldspace_position;
    vec4 raw_position;
} vtx_data;

uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;

void main()
{
    // Zero out the translation column of the view matrix so the skybox
    // stays centered on the camera and only rotates with it.
    mat4 view_without_translation = view;
    view_without_translation[3][0] = 0.0f;
    view_without_translation[3][1] = 0.0f;
    view_without_translation[3][2] = 0.0f;

    vtx_data.raw_position = position;
    vtx_data.worldspace_position = model * position;
    vtx_data.eyespace_position = view_without_translation * vtx_data.worldspace_position;

    // The .xyww swizzle sets z = w, so after the perspective divide the
    // fragment depth is exactly 1.0: the skybox sits on the far plane.
    gl_Position = (projection * vtx_data.eyespace_position).xyww;
}
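
(One detail worth noting about the .xyww trick: the skybox fragments land exactly at depth 1.0, which is what the depth buffer is typically cleared to, so with the default depth function GL_LESS they would be rejected; the skybox has to be drawn with glDepthFunc(GL_LEQUAL) so that depth == 1.0 passes.)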

From this, I'm trying to have my sky display as a very simple gradient from a deep blue at the top to a lighter blue at the horizon.

Obviously, simply mixing my two colors based on the Y coordinate of each fragment looks very bad: the whole top face of the cube sits at y = 1 and renders as one flat color, while the side faces get a linear vertical gradient, so the fact that you're looking at a box and not a dome is immediately clear, as seen here:

[screenshot: wrong skybox]

Note the fairly visible "corners" at the top left and top right of the box.

Instinctively, I thought the obvious fix would be to normalize the position of each fragment to get a position on a unit sphere, then take the Y coordinate of that. That should yield a value that is constant for a given "altitude" (the sine of the elevation angle), if that makes sense. Like this:

#version 330
in Data
{
    vec4 eyespace_position;
    vec4 eyespace_normal;
    vec4 worldspace_position;
    vec4 raw_position;
} vtx_data;

out vec4 outputColor;

const vec4 skytop = vec4(0.0f, 0.0f, 1.0f, 1.0f);            // deep blue
const vec4 skyhorizon = vec4(0.3294f, 0.92157f, 1.0f, 1.0f); // light horizon blue

void main()
{
    // normalize() on a vec4 operates on all four components, w included
    vec4 pointOnSphere = normalize(vtx_data.worldspace_position);
    float a = pointOnSphere.y;
    outputColor = mix(skyhorizon, skytop, a);
}

The result, however, is much the same as the first screenshot (I can post it if necessary, but since it's visually similar to the first, I'm skipping it to keep this question short).

After some random fiddling (cargo cult programming, I know :/), I realized that this works:

void main()
{
    // Normalize only the spatial xyz part; w stays out of the length
    vec3 pointOnSphere = normalize(vtx_data.worldspace_position.xyz);
    float a = pointOnSphere.y;
    outputColor = mix(skyhorizon, skytop, a);
}

The only difference is that I normalize the position without its W component.

And here's the working result (the difference is subtle in screenshots but quite noticeable in motion):

[screenshot: correct skybox]
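
As an aside, in both fragment shaders a goes negative below the horizon, and mix() then extrapolates past skyhorizon. If the bottom of the box can ever be visible, clamping the factor keeps the colors in range (a small sketch, not from the original code):

float a = clamp(pointOnSphere.y, 0.0f, 1.0f);
outputColor = mix(skyhorizon, skytop, a);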

So, finally, my question: why does this work when the previous version fails? I must be misunderstanding something extremely basic about homogeneous coordinates, but my brain just isn't clicking right now!

Upvotes: 7

Views: 3071

Answers (1)

Stefan Hanke

Reputation: 3518

GLSL's normalize does not handle homogeneous coordinates per se: it interprets the argument as a plain vector in R^4 and divides by its 4-dimensional Euclidean length, so the w component contributes to that length. This is in general not what you want: with w == 1, normalize(vtx_data.worldspace_position) divides y by sqrt(x^2 + y^2 + z^2 + 1), which varies with the fragment's distance from the cube's center rather than with its direction alone. However, if vtx_data.worldspace_position.w == 0, the normalize would produce the same result as the xyz version.
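
A quick numeric check makes this concrete (two arbitrary points on the cube, both at a 30-degree elevation angle as seen from the center, with w == 1):

// 4-D normalize: w contributes to the length, by a different amount per point
vec4 p1 = vec4(1.0, 0.577, 0.0, 1.0);   // length(p1.xyz) ~= 1.155
vec4 p2 = vec4(1.0, 0.816, 1.0, 1.0);   // length(p2.xyz) ~= 1.633
float a1 = normalize(p1).y;             // 0.577 / sqrt(1.333 + 1.0) ~= 0.378
float a2 = normalize(p2).y;             // 0.816 / sqrt(2.667 + 1.0) ~= 0.426 -> different colors

// 3-D normalize: both reduce to sin(30 degrees)
float b1 = normalize(p1.xyz).y;         // 0.577 / 1.155 = 0.5
float b2 = normalize(p2.xyz).y;         // 0.816 / 1.633 = 0.5 -> same color, as desired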

As for vec3 pointOnSphere = normalize(vtx_data.worldspace_position); (without the .xyz): that would not even compile, because normalize of a vec4 yields a vec4 and GLSL has no implicit vec4-to-vec3 conversion, so the left side would have to be a vec4 as well.
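
For reference, the two variants that do type-check behave differently:

vec4 p4 = normalize(vtx_data.worldspace_position);      // normalizes in R^4; w takes part in the length
vec3 p3 = normalize(vtx_data.worldspace_position.xyz);  // normalizes the 3-D direction only, which is what a skybox gradient needs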

Upvotes: 4
