CamoRanger12

Reputation: 29

GLSL converting uint to float for color

For my program I want to pass data into the vertex shader as unsigned bytes and then, in the vertex shader, convert each byte to a float color value between 0.0 and 1.0 with something like this:

#version 330 core
layout (location = 0) in uvec3 aPos;
layout (location = 1) in uvec4 aColor;

out VS_OUT {
    vec4 color;
} vs_out;

void main()
{
    vs_out.color = vec4(float(aColor.r) / 255.0f, float(aColor.g) / 255.0f, float(aColor.b) / 255.0f, 1.0f); //alpha set to 1 temporarily
    gl_Position = vec4(aPos, 1.0);
}

However, it seems that no matter how I write the constructors or operations, the result always converts to 1.0 when the input byte values are anything except 0 (zero results in 0.0). How do I fix this so I get the correct color values between 0.0 and 1.0? (Or is this a bug in OpenGL 3.3?)

code to pass data in:

unsigned char test[] = {
    0, 0, 0, 255, 1, 10, 255, 0, 255 // first 3 are position, next 4 are color, but it still outputs white
};
unsigned int VBO, VAO;
glGenBuffers(1, &VBO);
glGenVertexArrays(1, &VAO);
glBindVertexArray(VAO);
glBindBuffer(GL_ARRAY_BUFFER, VBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(test), test, GL_STATIC_DRAW);
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 3, GL_UNSIGNED_BYTE, GL_FALSE, 9 * sizeof(unsigned char), (void*)0);
glEnableVertexAttribArray(1);
glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, GL_FALSE, 9 * sizeof(unsigned char), (void*)(3 * sizeof(unsigned char)));
glBindVertexArray(0);
...
// draw points
ourShader.use(); // use the pre-defined shader program
glBindVertexArray(VAO);
glDrawArrays(GL_POINTS, 0, 1);

Upvotes: 0

Views: 3588

Answers (2)

CamoRanger12

Reputation: 29

I figured it out. I can declare the in variables in the shader as vec4 instead of uvec4 and set the vertex attribute's normalized parameter to GL_TRUE, and OpenGL will do the conversion automatically:

vertex shader:

#version 330 core
layout (location = 0) in vec3 aPos;
layout (location = 1) in vec4 aColor;

out VS_OUT {
    vec4 color;
} vs_out;

void main()
{
    vs_out.color = aColor;
    gl_Position = vec4(aPos, 1.0); 
}

program code:

unsigned char test[] = {
    0, 0, 0, 25, 100, 136, 255, 0, 255 // outputs a teal color (indices 3-6)
};
...

glEnableVertexAttribArray(1);
glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, GL_TRUE, 9 * sizeof(unsigned char), (void*)(3 * sizeof(unsigned char)));
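With the normalized parameter set to GL_TRUE, OpenGL divides each unsigned byte by 255 as it is fetched, so the color bytes 25, 100, 136 arrive in the shader as roughly 0.098, 0.392, 0.533.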

Upvotes: 2

Ripi2

Reputation: 7198

If you want to read integers in the shader, then use glVertexAttribIPointer instead of glVertexAttribPointer (notice the 'I'). With the non-I call, the data is converted to floating point, which does not match the uvec3/uvec4 attributes your shader declares, so the values the shader reads are undefined.
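For example, a minimal sketch of that integer path, keeping the question's original uvec shader and 9-byte stride (glVertexAttribIPointer takes no normalized parameter, since the bytes are passed through as integers):

// glVertexAttribIPointer hands the raw bytes (0-255) to the shader
// unchanged, so the manual division by 255.0 in the shader then works.
glEnableVertexAttribArray(0);
glVertexAttribIPointer(0, 3, GL_UNSIGNED_BYTE, 9 * sizeof(unsigned char), (void*)0);
glEnableVertexAttribArray(1);
glVertexAttribIPointer(1, 4, GL_UNSIGNED_BYTE, 9 * sizeof(unsigned char), (void*)(3 * sizeof(unsigned char)));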

BTW, your buffer has 9 values, but only 7 are currently used. If you plan to interleave more values (pos, color, pos, color, etc.), then review the "stride" parameter; it should be 7 instead of 9. I don't think you want to waste 2 × numOfVerts unused bytes just to keep that stride = 9. A tightly packed layout is sketched below.
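A sketch of what that tightly packed layout might look like, assuming the normalized GL_TRUE approach from the accepted answer and one vertex of 3 position bytes followed by 4 color bytes:

// 3 position bytes + 4 color bytes = 7 bytes per vertex, no padding
unsigned char test[] = {
    0, 0, 0, 255, 1, 10, 255 // pos, then RGBA color
};
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 3, GL_UNSIGNED_BYTE, GL_FALSE, 7 * sizeof(unsigned char), (void*)0);
glEnableVertexAttribArray(1);
glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, GL_TRUE, 7 * sizeof(unsigned char), (void*)(3 * sizeof(unsigned char)));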

Upvotes: 1
