Reputation: 4691
I have 3072 values of type `std::int16_t` (a 16-bit integer), which corresponds to a `GLshort`. I want to send those to my GLSL shaders. I read up on buffer textures and tried to implement that approach. However, the data no longer seems to be intact once it arrives in the shader. I'm not certain yet, but it looks like the values are all either `0` or maxed out. What am I doing wrong?
My initialization code looks like this (bar some stuff that isn't relevant):
```
// 1: Get the data - an array of GLshort:
GLshort tboData[3072];
for (size_t i = 0; i < 3072; ++i)
{
    // cdb.getSprite() returns std::int16_t
    tboData[i] = (GLshort) cdb.getSprite(i);
}

// 2: Make sure the shader program is being used:
sp->use(); // sp is a wrapper class for GL shader programs

// 3: Generate the GL_TEXTURE_BUFFER, bind it and send the data:
GLuint tbo;
glGenBuffers(1, &tbo);
glBindBuffer(GL_TEXTURE_BUFFER, tbo);
glBufferData(GL_TEXTURE_BUFFER, sizeof(tboData), tboData, GL_STATIC_DRAW);

// 4: Generate the buffer texture, activate and bind it:
GLuint tboTex;
glGenTextures(1, &tboTex);
glActiveTexture(GL_TEXTURE1); // GL_TEXTURE0 is a spritesheet
glBindTexture(GL_TEXTURE_BUFFER, tboTex);

// 5: Associate them, using GL_R16 to match the 16-bit integer array:
glTexBuffer(GL_TEXTURE_BUFFER, GL_R16, tbo);

// 6: Make the connection within the shader:
glUniform1i(sp->getUniformLocation("tbo"), 1);
glBindBuffer(GL_TEXTURE_BUFFER, 0);
```
At the beginning of my rendering loop, I make sure everything is bound:
```
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, spriteSheet);
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_BUFFER, tboTex);
```
In my rendering loop, I set a uniform that will be used to index the TBO:
```
glUniform1i(sp->getUniformLocation("tboIndex"), tboIndex);
```
Now, the important pieces of the vertex shader:
```
#version 140

// other ins, outs, uniforms ...

uniform int tboIndex;
uniform isamplerBuffer tbo;

out float pass_Sprite; // Frag shader crashes when this is int, why?

void main()
{
    // gl_Position, matrices, etc ...
    pass_Sprite = texelFetch(tbo, tboIndex).r;
}
```
Any advice is welcome.
Upvotes: 0
Views: 3166
Reputation: 473407
> `GL_R16`

That is an unsigned normalized integer format. Your buffer sampler (an `isamplerBuffer`) expects signed, non-normalized integers. The correct image format would be `GL_R16I`.
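The fix is a one-line change in step 5 of your setup code (same names as in the question):

```
// 5: Associate them; GL_R16I is a signed, non-normalized 16-bit
// integer format, which matches both the GLshort array and the
// isamplerBuffer in the shader:
glTexBuffer(GL_TEXTURE_BUFFER, GL_R16I, tbo);
```

With `GL_R16`, the image format and the sampler type don't match, so the results of `texelFetch` are undefined, which fits the "all `0` or maxed out" symptom.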
> Frag shader crashes when this is int, why?

Probably because you didn't use `flat` interpolation. Interpolation doesn't work for non-floating-point values.
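A minimal sketch of the declarations (the fragment shader isn't shown in the question, so its side is assumed here):

```
// Vertex shader:
flat out int pass_Sprite; // 'flat' disables interpolation for this output

// Fragment shader:
flat in int pass_Sprite;  // qualifier and type must match the vertex shader
```

Note that `texelFetch` on an `isamplerBuffer` returns an `ivec4`, so its `.r` component is already an `int`; with a `flat int` output you avoid the int-to-float conversion entirely.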
And it wouldn't crash; it would simply fail to compile. You should always check compile errors in your shaders.
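A minimal sketch of such a check (the `checkCompileStatus` helper is hypothetical; in your code the `sp` wrapper would be the natural home for it):

```
#include <iostream>

// Query the compile status of a shader object and print the info
// log if compilation failed:
void checkCompileStatus(GLuint shader)
{
    GLint status = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    if (status != GL_TRUE)
    {
        GLchar log[1024];
        glGetShaderInfoLog(shader, sizeof(log), nullptr, log);
        std::cerr << "Shader compile error:\n" << log << std::endl;
    }
}
```

The same applies to linking: query `GL_LINK_STATUS` with `glGetProgramiv` and read `glGetProgramInfoLog` on failure.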
Upvotes: 7