Reputation: 734
I am trying to implement some simple 3D functionality in a game I am currently developing. I am using the LibGDX game engine, as it seemed to integrate nicely with OpenGL ES, but I have run into a problem stemming from my inability to figure out how to create a simple `Buffer` object from an image in Java.
My goal is to have a cube spinning in 3D space with some textures on it. The problem is that, although I managed to render the cube and make it spin, I cannot find any tutorials, code examples, or documentation on how to actually decode an image into a `Buffer` in order to texture the cube. I have found quite a few examples illustrating how to do this on Android, but those rely on the `BitmapFactory.decodeResource(<your_image_here>)` method in the Android SDK, which is not an option for me, as I am trying to keep my code portable across many platforms.
For this reason, I have concluded that I must somehow decode an image (a `.bmp` file) into a `Buffer` in order to leverage the `Gdx.gl10.glTexImage2D()` method that comes packaged with OpenGL ES.
I have pursued numerous alternatives in trying to find a solution, including the `Texture.bind()` method embedded in the `Texture` class that comes with the LibGDX package.

Here is some of the code I am using to render the cube. A lot of the OpenGL "switch-flipping" is omitted for the sake of clarity, so if you think I have missed something, please post a comment and I will confirm whether it is simply in another part of another class.
I know that this code will be rather tedious to sift through, so I have done my best to comment it where necessary.
Note that this code represents my current implementation, and is the code associated with the "Debugging Information" section below.
public void draw(){
    // This is used to hold the OpenGL pointer to the texture
    int[] textures = new int[1];
    // Tells the GPU to render the vertices in a clockwise manner
    Gdx.gl10.glFrontFace(GL10.GL_CW);
    // Informs OpenGL of the location of the vertices
    Gdx.gl10.glVertexPointer(3, GL10.GL_FLOAT, 0, mVertexBuffer);
    // Bind the texture pointer in OpenGL to the texture
    Gdx.gl10.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
    // This is the function I found in LibGDX that I think I am supposed to use?
    mTexture.bind();
    // Some "switch-flipping" in OpenGL
    Gdx.gl10.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
    Gdx.gl10.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
    Gdx.gl10.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
    Gdx.gl10.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
    Gdx.gl10.glTexEnvf(GL10.GL_TEXTURE_ENV, GL10.GL_TEXTURE_ENV_MODE, GL10.GL_REPLACE);
    // This tells OpenGL to assign my Texture to the 3D image
    Gdx.gl10.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_RGB, 256, 256, 0, GL10.GL_RGB, GL10.GL_UNSIGNED_BYTE, mTexture);
    // Finally, this draws the objects to the screen
    Gdx.gl.glDrawElements(GL10.GL_TRIANGLES, 6, GL10.GL_UNSIGNED_BYTE, mIndexBuffer);
}
How do you decode an image directly into a `Buffer` in Java? If this cannot be done, how should I go about texturing my 3D objects in LibGDX? Are there any other resources I should consult?
Last Updated: 19 Mar 2013 12:45PM EST
Here is some information related to the current debugging process of my application. I will update this as I progress through this issue:

- The implementation above currently fails with `Invalid memory access of location 0x0 rip=0x11dc77564`; perhaps this implementation is the correct one, and I am just missing some syntax elsewhere?
- It seems a `Mesh` object would be the correct route here. I will try to pursue this implementation and update regarding my results! I was unable to locate this before, as it is considered "deprecated" by the LibGDX team, so I am unsure whether it will be replaced soon.

Some things that should be taken into consideration when answering my question are:
- It should be noted that I am doing this using the LibGDX game engine, and for that reason I will also be posting on their forums regarding this issue.
- Further, there seem to be many ways to get a `BufferedImage` from an image, but there then seems to be no way to extract a `Buffer` from that `BufferedImage`. Am I simply overlooking something glaringly obvious here? Are `BufferedImage`s really all I should need?
- I would prefer not to just include the Android SDK in my build path and use their simple `BitmapFactory.decodeResource(<your_image_here>)` method.
- I am currently trying to implement this using OpenGL ES 1.0 for compatibility reasons.
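For reference, the closest I have gotten on desktop Java is pulling the pixels out of a `BufferedImage` by hand and packing them into a direct `ByteBuffer`. This is only a sketch of what I have in mind; I am not sure it is the intended approach, and `BufferedImage` is not available on Android, which again hurts portability:

```java
import java.awt.image.BufferedImage;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ImageToBuffer {
    /** Packs the pixels of an image into a direct RGB ByteBuffer, row by row,
     *  in the layout glTexImage2D expects for GL_RGB / GL_UNSIGNED_BYTE. */
    public static ByteBuffer toRgbBuffer(BufferedImage image) {
        int width = image.getWidth();
        int height = image.getHeight();
        ByteBuffer buffer = ByteBuffer.allocateDirect(width * height * 3)
                                      .order(ByteOrder.nativeOrder());
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int argb = image.getRGB(x, y);            // packed 0xAARRGGBB
                buffer.put((byte) ((argb >> 16) & 0xFF)); // red
                buffer.put((byte) ((argb >> 8) & 0xFF));  // green
                buffer.put((byte) (argb & 0xFF));         // blue
            }
        }
        buffer.flip(); // rewind so OpenGL reads from the start
        return buffer;
    }
}
```

In theory, the resulting buffer could then be handed to `Gdx.gl10.glTexImage2D(...)` in place of the `mTexture` object in the code above.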
Upvotes: 4
Views: 591
Reputation: 25177
I think you want to avoid the raw OpenGL texture and buffer APIs, and stick to the Libgdx `Mesh` and `Texture` wrappers. They hide a bunch of the OpenGL details. Of course, if you're interested in learning about the OpenGL details, this won't really help. (But perhaps you can get things working with the Libgdx APIs, and then look through their source to see what is really happening.)
The first step is to make sure your `Mesh` (or your raw `mVertexBuffer` OpenGL vertex list, if you're not using a `Mesh`) contains valid (u, v) texture coordinates for each vertex. These coordinates will refer to the "current" texture at render time. See https://code.google.com/p/libgdx/wiki/MeshColorTexture#Texture.
First, define your Mesh to include texture information with each vertex:
mesh = new Mesh(true, 3, 3,
new VertexAttribute(Usage.Position, 3, "a_position"),
new VertexAttribute(Usage.ColorPacked, 4, "a_color"),
new VertexAttribute(Usage.TextureCoordinates, 2, "a_texCoords"));
(This mesh has color data too, but you can leave off the line with "a_color" if you don't want to blend in raw color data.)
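For instance, if you leave the color attribute off, each vertex shrinks to (x, y, z, u, v), and the vertex array has to change to match. A sketch, using the same attribute names and triangle as above:

```java
// Position (3 floats) + texture coordinates (2 floats) per vertex
Mesh mesh = new Mesh(true, 3, 3,
        new VertexAttribute(Usage.Position, 3, "a_position"),
        new VertexAttribute(Usage.TextureCoordinates, 2, "a_texCoords"));

// Each vertex is now (x, y, z, u, v) -- no packed color float
mesh.setVertices(new float[] { -0.5f, -0.5f, 0, 0,    1,
                                0.5f, -0.5f, 0, 1,    1,
                                0,     0.5f, 0, 0.5f, 0 });
```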
The other preparatory step is to load your texture file into a Libgdx `Texture`. Assuming the file is packaged in the standard "assets" location, you should just be able to load it with a relative path:
Texture texture = new Texture("path/to/image.bmp");
Libgdx can generally load and parse BMP, PNG, and JPEG file formats. For other formats, I think you'd need to create a `Pixmap` object yourself, and then plug that into a `Texture`.
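As a sketch of that route (the decoding of your custom format is up to you; `drawPixel` and the `RGB888` format come from the Libgdx `Pixmap` API):

```java
// Decode your custom format into raw pixels however you like,
// then copy them into a Pixmap and build a Texture from it.
Pixmap pixmap = new Pixmap(256, 256, Pixmap.Format.RGB888);
// Call pixmap.drawPixel(x, y, rgba8888) for each decoded pixel, e.g.:
pixmap.drawPixel(0, 0, Color.rgba8888(1f, 0f, 0f, 1f));
Texture texture = new Texture(pixmap);
pixmap.dispose(); // the Texture keeps its own copy of the pixel data
```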
Second, make sure each vertex has (u,v) texture coordinates associated with it:
mesh.setVertices(new float[] { -0.5f, -0.5f, 0, Color.toFloatBits(255, 0, 0, 255), 0, 1,
                                0.5f, -0.5f, 0, Color.toFloatBits(0, 255, 0, 255), 1, 1,
                                0,     0.5f, 0, Color.toFloatBits(0, 0, 255, 255), 0.5f, 0 });
Each line above defines the "(x, y, z, rgba, u, v)" for each vertex. (The "rgba" is the whole color squashed into a single float.)
At this point you have defined a `Mesh` with color information and texture (u, v) coordinates.
The second step is, during the `render` call, to bind your texture so the (u, v) coordinates in the `Mesh` have something to reference:
texture.bind();
mesh.render(GL10.GL_TRIANGLES, 0, 3);
You don't need to directly use any of the OpenGL texture or vertex APIs.
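For completeness, a minimal fixed-function `render()` might look like this (a sketch; note that with OpenGL ES 1.0, texturing also has to be switched on, which `Texture.bind()` alone does not do):

```java
public void render() {
    GL10 gl = Gdx.graphics.getGL10();
    gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
    gl.glEnable(GL10.GL_TEXTURE_2D); // fixed-function texturing must be enabled
    texture.bind();                  // makes this the "current" texture for the (u, v) coords
    mesh.render(GL10.GL_TRIANGLES, 0, 3);
}
```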
Upvotes: 2