Reputation: 1262
I'm currently working to convert a project from using a texture atlas to an array texture, but for the life of me I can't get it working.
Some notes about my environment:
Problems I believe I've ruled out:
Here's my code for creating the array texture:
public void createTextureArray() {
    glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    int handle = glGenTextures();
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D_ARRAY, handle);

    glPixelStorei(GL_UNPACK_ROW_LENGTH, Texture.SIZE);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 4);

    glTexStorage3D(GL_TEXTURE_2D_ARRAY, 1, GL_RGBA8, Texture.SIZE, Texture.SIZE, textures.size());

    try {
        int layer = 0;
        for (Texture tex : textures.values()) {
            // Next few lines are just for loading the texture. They've been ruled out as the issue.
            PNGDecoder decoder = new PNGDecoder(ImageHelper.asInputStream(tex.getImage()));
            ByteBuffer buffer = BufferUtils.createByteBuffer(decoder.getWidth() * decoder.getHeight() * 4);
            decoder.decode(buffer, decoder.getWidth() * 4, PNGDecoder.Format.RGBA);
            buffer.flip();

            glTexSubImage3D(GL_TEXTURE_2D_ARRAY, 0, 0, 0, layer, decoder.getWidth(), decoder.getHeight(), 1,
                    GL_RGBA, GL_UNSIGNED_BYTE, buffer);

            tex.setLayer(layer);
            layer++;
        }
    } catch (IOException ex) {
        ex.printStackTrace();
        System.err.println("Failed to create/load texture array");
        System.exit(-1);
    }
}
The code for creating the VAO/VBO:
private static int prepareVbo(int handle, FloatBuffer vbo) {
    IntBuffer vaoHandle = BufferUtils.createIntBuffer(1);
    glGenVertexArrays(vaoHandle);
    glBindVertexArray(vaoHandle.get());

    glBindBuffer(GL_ARRAY_BUFFER, handle);
    glBufferData(GL_ARRAY_BUFFER, vbo, GL_STATIC_DRAW);

    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D_ARRAY, GraphicsMain.TEXTURE_REGISTRY.atlasHandle);

    glEnableVertexAttribArray(positionAttrIndex);
    glEnableVertexAttribArray(texCoordAttrIndex);
    glVertexAttribPointer(positionAttrIndex, 3, GL_FLOAT, false, 24, 0);
    glVertexAttribPointer(texCoordAttrIndex, 3, GL_FLOAT, false, 24, 12);

    glBindVertexArray(0);
    vaoHandle.rewind();
    return vaoHandle.get();
}
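For reference, the magic numbers in the attribute pointers (stride 24, offsets 0 and 12) follow from the vertex layout: three position floats plus three texture-coordinate floats per vertex, at 4 bytes each. A quick check of that arithmetic (the class name is just for illustration):

```java
public class VertexLayoutCheck {
    public static void main(String[] args) {
        final int FLOAT_BYTES = 4;      // same as Float.BYTES
        int positionComponents = 3;     // x, y, z
        int texCoordComponents = 3;     // s, t, layer index for the array texture

        int stride = (positionComponents + texCoordComponents) * FLOAT_BYTES;
        int positionOffset = 0;
        int texCoordOffset = positionComponents * FLOAT_BYTES;

        System.out.println(stride);         // 24, the stride passed to glVertexAttribPointer
        System.out.println(positionOffset); // 0
        System.out.println(texCoordOffset); // 12
    }
}
```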
Fragment shader:
#version 330 core

uniform sampler2DArray texArray;

varying vec3 texCoord;

void main() {
    gl_FragColor = texture(texArray, texCoord);
}
(texCoord is working fine; it's being passed from the vertex shader correctly.)
I'm about out of ideas, so being still somewhat new to modern OpenGL, I'd like to know if there's anything glaringly wrong with my code.
Upvotes: 0
Views: 648
Reputation: 7190
Some considerations:
- glTexParameteri affects whatever is bound to GL_TEXTURE_2D_ARRAY at that moment, so you want to move those calls to after glBindTexture
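That first point would look like this in the poster's createTextureArray (a minimal sketch of the intended order; texture parameters only take effect on the texture currently bound to the target):

```java
// Generate and bind first, so subsequent calls target this texture
int handle = glGenTextures();
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D_ARRAY, handle);

// Only now do the parameter calls affect the new array texture
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
```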
- Why glGenTextures()? If you have the possibility to use a more specific method, please use it
- GL_UNPACK_ROW_LENGTH, if greater than 0, defines the number of pixels in a row. I suppose then that Texture.SIZE is not really the texture size but the dimension of one side (128 in your case). Anyway, you don't need to set it here, so you can skip it
- Set GL_UNPACK_ALIGNMENT to 4 only if your row length is a multiple of it. Most of the time people set it to 1 before loading a texture to avoid any trouble, then set it back to 4 once done
- The last argument of glTexStorage3D is expected to be the number of layers; I hope textures.size() returns that, rather than the size (128x128)
- The glActiveTexture and glBindTexture calls inside prepareVbo are useless: they are not part of the VAO state
- varying in GLSL is deprecated; switch to simple in/out qualifiers
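For the varying point, a #version 330 core version of the poster's fragment shader might look like this (a sketch; gl_FragColor is likewise unavailable in the core profile, so an explicit out variable is used, and the vertex shader's matching output must be declared out vec3 texCoord):

```glsl
#version 330 core

uniform sampler2DArray texArray;

in vec3 texCoord;    // was: varying vec3 texCoord;
out vec4 fragColor;  // replaces the deprecated gl_FragColor

void main() {
    fragColor = texture(texArray, texCoord);
}
```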
- Check glGetError(): some silent errors may not show up explicitly in the rendered output
- You call it prepareVbo, but you actually initialize both the VAO and the VBO in it

Upvotes: 2