mchiasson

Reputation: 2602

Diffuse lighting error on parallel surfaces

As a test, I created a simple quad. Here are its attributes:

Vertex vertices[] =
{
    //    Positions        Normals
    {vec3(-1,-1, 0), vec3(-1,-1, 1)}, // v0
    {vec3( 1,-1, 0), vec3( 1,-1, 1)}, // v1
    {vec3(-1, 1, 0), vec3(-1, 1, 1)}, // v2
    {vec3( 1, 1, 0), vec3( 1, 1, 1)}, // v3
};

And I put it in my world space at (0.0, 0.0, -9.5). Then I put my point light position at (0.0, 0.0, -8.0). My camera is at the origin (0.0, 0.0, 0.0). When I run my program, this works as expected:

(image: diffuse lighting on a single quad)

But then, when I replace this quad with 9 scaled-down quads, all placed at -9.5 on Z (in other words, they are all coplanar and parallel to the original), my diffuse lighting gets a little weird:

(image: diffuse lighting on nine quads)

It looks like the corners are receiving too much light, breaking the nice circular diffuse falloff that we see on the single quad.

Here is my shader program:

precision mediump int;
precision mediump float;

varying vec3 v_position;
varying vec3 v_normal;

#if defined(VERTEX)
uniform mat4 u_mvpMatrix;
uniform mat4 u_mvMatrix;
uniform mat3 u_normalMatrix;

attribute vec4 a_position;
attribute vec3 a_normal;

void main()
{
    vec4 position = u_mvMatrix * a_position;
    v_position    = position.xyz / position.w;
    v_normal      = normalize(u_normalMatrix * a_normal);

    gl_Position = u_mvpMatrix * a_position;
}
#endif // VERTEX

#if defined(FRAGMENT)
uniform vec3  u_pointLightPosition;

void main()
{
    vec3 viewDir                = normalize(-v_position);
    vec3 normal                 = normalize(v_normal);
    vec3 lightPosition          = u_pointLightPosition - v_position;
    vec3 pointLightDir          = normalize(lightPosition);
    float distance              = length(lightPosition);
    float pointLightAttenuation = 1.0 / (1.0 + (0.25 * distance * distance));
    float diffuseTerm           = max(dot(pointLightDir, normal), 0.15);

    gl_FragColor = vec4(diffuseTerm * pointLightAttenuation);
}
#endif // FRAGMENT

My uniforms are uploaded as follows (I'm using GLM):

const mat4 &view_matrix = getViewMatrix();
mat4 mv_matrix          = view_matrix * getModelMatrix();
mat4 mvp_matrix         = getProjectionMatrix() * mv_matrix;
mat3 normal_matrix      = inverseTranspose(mat3(mv_matrix));
vec3 pointLightPos      = vec3(view_matrix * vec4(getPointLightPos(), 1.0f));

glUniformMatrix4fv(   mvpMatrixUniformID, 1, GL_FALSE, (GLfloat*)&mvp_matrix);
glUniformMatrix4fv(    mvMatrixUniformID, 1, GL_FALSE, (GLfloat*)&mv_matrix);
glUniformMatrix3fv(normalMatrixUniformID, 1, GL_FALSE, (GLfloat*)&normal_matrix);
glUniform3f(pointLightPosUniformID, pointLightPos.x, pointLightPos.y, pointLightPos.z);

Am I doing anything wrong?

Thanks!

Upvotes: 0

Views: 88

Answers (1)

jozxyqk

Reputation: 17266

Without going too much into your code, I think everything is working just fine. I see a very similar result with a quick Blender setup:

(image: the same corner artifact reproduced in Blender)

The issue is that interpolating the normals doesn't produce a spherical bump.

It ends up being a patch like this (I simply subdivided a smooth-shaded cube)...

(image: the interpolated normal patch on a subdivided smooth-shaded cube)

If you want a more spherical bump, you could generate the normals implicitly in the fragment shader (for example, as is done here (bottom image)), use a normal map, or use more finely tessellated geometry such as an actual sphere.

Upvotes: 1
