Reputation: 1
I am trying to load a 1D texture of unsigned integers into a shader. Each element of the texture can only be 1 or 0 (I need it this way), and that value depends on the choice the user makes in the UI of my app. I am completely stuck on this apparently simple issue and cannot see what I am doing wrong. This is my OpenGL code:
glGenTextures(1, &m_inside_texture_id);
vtkgl::ActiveTexture(vtkgl::TEXTURE16);
glBindTexture(GL_TEXTURE_1D, m_inside_texture_id);
glTexImage1D(GL_TEXTURE_1D, 0, vtkgl::R8UI, insideVector.size(),
             0, vtkgl::R8UI, GL_UNSIGNED_BYTE, &insideVector[0]);
glBindTexture(GL_TEXTURE_1D, m_inside_texture_id);
glTexParameterf( GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_NEAREST );
glTexParameterf( GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_NEAREST );
glTexParameterf( GL_TEXTURE_1D, GL_TEXTURE_WRAP_S, vtkgl::CLAMP_TO_EDGE );
glBindTexture(GL_TEXTURE_1D, 0);
And this is the GLSL code I am using to fetch from that texture:
#version 120
#extension GL_EXT_gpu_shader4 : enable
uniform usampler1D insideVector;
uint inside2 = texture1D(insideVector, indexInVector).a;
if (inside2 == 0)
{
gl_FragColor = shade(vec4(1.0, 0.0, 0.0, 1.0));
}
else if (inside2 == 1)
{
gl_FragColor = shade(vec4(0.0, 1.0, 0.0, 1.0));
}
insideVector is a std::vector of GLubytes. indexInVector is a normalized texture coordinate that grows each time the user makes a choice (on each choice: insideVector.push_back(1) or insideVector.push_back(0), then indexInVector = insideVector.size() - (1/255)).
It compiles, but crashes when I try to use the inside2 variable. The OpenGL version I am using is 2.1 with a lot of extensions. Thanks in advance.
EDIT: The uniform variable insideVector is set as follows:
vtkUniformVariables *v;
int ivalue = 16; // texture unit 16
v->SetUniformi("insideVector", 1, &ivalue);
SOLVED: The issue was in the shader. inside2 is an unsigned integer, so it must be compared against unsigned literals (0u and 1u) instead of signed ones (0 and 1).
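For reference, the fixed fragment of the shader looks like this (a minimal sketch combining the unsigned-literal fix with the .r swizzle pointed out in the answer below; the surrounding shader code is unchanged):

```glsl
uint inside2 = texture1D(insideVector, indexInVector).r; // .r, since R8UI has only a red channel
if (inside2 == 0u)        // unsigned literal: comparing a uint against signed 0 fails here
{
    gl_FragColor = shade(vec4(1.0, 0.0, 0.0, 1.0));
}
else if (inside2 == 1u)
{
    gl_FragColor = shade(vec4(0.0, 1.0, 0.0, 1.0));
}
```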
Upvotes: 0
Views: 887
Reputation: 1216
Likely unrelated to the crash, but the "R" in "R8UI" means red, so in your shader you should replace
uint inside2 = texture1D(insideVector, indexInVector).a;
with
uint inside2 = texture1D(insideVector, indexInVector).r;
Also, the glTexImage1D call looks wrong: the format parameter should be GL_RED_INTEGER; GL_R8UI is an internal format, not a pixel-transfer format.
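Putting both together, a corrected upload might look like this (untested sketch; it assumes the vtkgl wrapper exposes RED_INTEGER the same way it exposes R8UI — if not, use the raw GL_RED_INTEGER constant from your extension loader):

```cpp
glBindTexture(GL_TEXTURE_1D, m_inside_texture_id);
glTexImage1D(GL_TEXTURE_1D, 0,
             vtkgl::R8UI,          // internal format: one 8-bit unsigned-integer channel
             insideVector.size(), 0,
             vtkgl::RED_INTEGER,   // pixel-transfer format: integer red channel
             GL_UNSIGNED_BYTE,     // client data type, matches the GLubyte vector
             &insideVector[0]);
```

With a non-integer format such as GL_RED here, integer internal formats generate GL_INVALID_OPERATION and the texture is left incomplete.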
Upvotes: 1