Reputation: 23
I started drawing a hexagonal grid on a canvas with a Path object. I did all the calculations for how many hexagons fit on a particular display, depending on the hexagon size.
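For context, a minimal sketch of that kind of fit calculation, assuming a flat-top layout where columns advance by 1.5 * s and rows by sqrt(3) * s for side length s (one common packing; the exact layout may differ):

// Sketch only: counts how many flat-top hexagons of side s fit on the display
static int[] hexGridSize(int widthPx, int heightPx, float s) {
    // n columns span (n - 1) * 1.5 * s + 2 * s = (1.5 * n + 0.5) * s pixels
    int cols = (int) ((widthPx - 0.5f * s) / (1.5f * s));
    // each row is sqrt(3) * s tall (the flat-to-flat height of one hexagon)
    int rows = (int) (heightPx / (Math.sqrt(3) * s));
    return new int[] { cols, rows };
}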
I have all my hexagon coordinates calculated relative to the canvas. But now I have serious performance issues and have to port this to OpenGL.
Because the algorithm works in canvas coordinates, I'm trying to convert those "canvas" hexagon coordinates to the OpenGL coordinate system with GLU.gluUnProject, in a for loop:
float near[] = { 0.0f, 0.0f, 0.0f, 0.0f }; // gluUnProject result (homogeneous x, y, z, w)
GLU.gluUnProject(b.hexes[ii][jj].points[ii].x,
        b.hexes[ii][jj].points[ii].y, 0, mg.mModelView, 0,
        mg.mProjection, 0, viewport, 0, near, 0);
vertices[zz] = near[0] / near[3]; // divide by w to normalize
zz++;
vertices[zz] = near[1] / near[3];
zz++;
vertices[zz] = 0; // Z is always 0
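For completeness, the loop above relies on some setup that is not shown. Roughly (surfaceWidth and surfaceHeight are placeholder names), it assumes something like:

int[] viewport = { 0, 0, surfaceWidth, surfaceHeight }; // the current GL viewport rectangle
// mg tracks the current modelview/projection matrices; GL ES 1.0 cannot read
// them back with glGetFloatv, so a matrix-tracking wrapper (such as the
// MatrixGrabber from the Android API demos) is needed. Android's gluUnProject
// can return unnormalized homogeneous coordinates, hence the near[3] division.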
Because I lack OpenGL knowledge, I don't know how to set glViewport, gluPerspective, and glTranslatef for a 2D world that is "the same" as the canvas.
So my question is:
How do I set those three things so that my first hexagon (on the canvas, the first one is at the top left) ends up in the same place in the OpenGL world?
Update
Thank you all for your interest in helping with my problem. But:
I now set my 18-float vertices array as follows (Z is always 0): [20.0, 0.0, 0.0, 60.0, 0.0, 0.0, 80.0, 34.641018, 0.0, 60.0, 69.282036, 0.0, 20.0, 69.282036, 0.0, 0.0, 34.641018, 0.0]
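Those values describe a flat-top hexagon with side length 40 whose bounding box starts at the origin. A minimal sketch of how such an array can be generated (the helper name hexVertices is mine; it starts at a different corner than the array above but traces the same fan):

// Sketch: 6 corners of a flat-top hexagon, 3 floats (x, y, z) per corner
static float[] hexVertices(float cx, float cy, float s) {
    float[] v = new float[18];
    for (int i = 0; i < 6; i++) {
        double a = Math.toRadians(60 * i); // corners at 0, 60, ..., 300 degrees
        v[3 * i]     = (float) (cx + s * Math.cos(a));
        v[3 * i + 1] = (float) (cy + s * Math.sin(a));
        v[3 * i + 2] = 0f; // Z is always 0
    }
    return v;
}
// hexVertices(40f, 34.641018f, 40f) reproduces the values above

The array is then loaded into a direct FloatBuffer: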
ByteBuffer vertexByteBuffer = ByteBuffer
        .allocateDirect(vertices.length * 4); // 4 bytes per float
vertexByteBuffer.order(ByteOrder.nativeOrder()); // native byte order, as OpenGL expects
vertexBuffer = vertexByteBuffer.asFloatBuffer();
vertexBuffer.put(vertices);
vertexBuffer.position(0); // rewind before handing the buffer to glVertexPointer
and draw:
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glColor4f(0.0f, 1.0f, 0.0f, 0.5f);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
gl.glDrawArrays(GL10.GL_TRIANGLE_FAN, 0, vertices.length / 3); // 18 floats / 3 = 6 vertices
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
Before drawing, in onDrawFrame() I set:
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
gl.glLoadIdentity();
gl.glTranslatef(0.0f, 0.0f, -5.0f);
In onSurfaceChanged():
gl.glViewport(0, 0, width, height);
gl.glMatrixMode(GL10.GL_PROJECTION);
gl.glLoadIdentity();
gl.glOrthof(0, width, height, 0, -1, 1);
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glLoadIdentity();
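One thing to note in this setup: glOrthof(0, width, height, 0, -1, 1) limits the visible depth range to [-1, 1], so the glTranslatef(0.0f, 0.0f, -5.0f) in onDrawFrame() pushes the hexagon outside the clip volume and nothing gets drawn. For a pure 2D scene the translation can simply be dropped; a minimal sketch (drawHexagons is a hypothetical helper standing in for the draw calls above):

public void onDrawFrame(GL10 gl) {
    gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
    gl.glLoadIdentity(); // z stays 0, which is inside the [-1, 1] depth range
    drawHexagons(gl);    // hypothetical helper containing the draw calls above
}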
Upvotes: 1
Views: 1445
Reputation: 5238
There is no need to transform the vertex coordinates yourself. All you have to do is provide the correct projection matrix to OpenGL.
Basically, this would be something along the lines of:
glViewport(0, 0, canvasWidth, canvasHeight);
glMatrixMode(GL_PROJECTION); // or some matrix uniform if using shaders
glLoadIdentity();
glOrtho(0, canvasWidth, canvasHeight, 0, -1, 1); // this will allow to pass vertices in 'canvas pixel' coordinates
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
Depending on the order in which you're passing your vertices, you might want to make sure that culling is disabled.
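In the GL10 API from the question, that would be, for instance:

gl.glDisable(GL10.GL_CULL_FACE); // draw triangles regardless of winding order
// or keep culling and declare clockwise polygons as front-facing instead:
// gl.glFrontFace(GL10.GL_CW);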
Upvotes: 1