Unome

Reputation: 6900

How to get a ByteBuffer that will work with LWJGL?

I'm trying to call the glTexImage2D function from the OpenGL library. I'm using LWJGL as the framework for OpenGL in Java.

According to the documentation, this method accepts the following parameters:

public static void glTexImage2D(int target,
            int level,
            int internalformat,
            int width,
            int height,
            int border,
            int format,
            int type,
            java.nio.ByteBuffer pixels)

My implementation of this is below.

GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGB, 1092, 1092, 0, GL11.GL_RGB, GL11.GL_INT, imageData);

However, I am getting an error:

Exception in thread "main" java.lang.IllegalArgumentException: Number of remaining buffer elements is 3577392, must be at least 14309568. Because at most 14309568 elements can be returned, a buffer with at least 14309568 elements is required, regardless of actual returned element count
    at org.lwjgl.BufferChecks.throwBufferSizeException(BufferChecks.java:162)
    at org.lwjgl.BufferChecks.checkBufferSize(BufferChecks.java:189)
    at org.lwjgl.BufferChecks.checkBuffer(BufferChecks.java:230)
    at org.lwjgl.opengl.GL11.glTexImage2D(GL11.java:2855)
    at TextureLab.testTexture(TextureLab.java:100)
    at TextureLab.start(TextureLab.java:39)
    at TextureLab.main(TextureLab.java:20)

I've done a lot of searching, and I assume my method of creating a ByteBuffer for the last parameter is what is causing the issue.

My code for getting a ByteBuffer from an image is as follows:

img = ImageIO.read(file);
byte[] pixels = ((DataBufferByte) img.getRaster().getDataBuffer()).getData();

ByteBuffer buffer = BufferUtils.createByteBuffer(pixels.length);
buffer.put(pixels);
buffer.flip();
buffer.rewind();

I've substituted width*height*4 for the buffer length and even hardcoded the number from the error message, all with no luck. Any ideas what I'm doing wrong? I think the issue is in my ByteBuffer, but even that I'm not sure of.

Upvotes: 1

Views: 1307

Answers (1)

derhass

Reputation: 45362

The LWJGL layer is telling you that your buffer should be at least 14309568 bytes big, but you provide only 3577392. The reason is that you used GL_INT as the type parameter of the glTexImage2D call, so the GL assumes each pixel is represented by three 4-byte integer components: 1092 * 1092 * 3 * 4 = 14309568 bytes.

You just want to use GL_UNSIGNED_BYTE for typical 8-bit-per-channel image content, which maps exactly to the 1092 * 1092 * 3 = 3577392 bytes you are currently providing.
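For reference, a minimal sketch of the corrected upload, assuming a 3-byte-per-pixel RGB image loaded through ImageIO (the file name "texture.png" is just a placeholder):

import java.io.File;
import java.nio.ByteBuffer;
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import javax.imageio.ImageIO;
import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.GL11;

// Load the image and grab its backing byte array (placeholder file name)
BufferedImage img = ImageIO.read(new File("texture.png"));
byte[] pixels = ((DataBufferByte) img.getRaster().getDataBuffer()).getData();

// 3 bytes per pixel for RGB, so pixels.length == width * height * 3
ByteBuffer buffer = BufferUtils.createByteBuffer(pixels.length);
buffer.put(pixels);
buffer.flip();

// With type = GL_UNSIGNED_BYTE the GL expects width * height * 3 bytes,
// i.e. 1092 * 1092 * 3 = 3577392 for your image, matching the buffer above.
GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGB, img.getWidth(), img.getHeight(),
        0, GL11.GL_RGB, GL11.GL_UNSIGNED_BYTE, buffer);

Note that ImageIO often returns the raster in BGR order (TYPE_3BYTE_BGR), so red and blue may appear swapped unless you reorder the channels or pass GL12.GL_BGR as the format.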

Upvotes: 6
