Reputation: 863
I've run into an issue with OpenGL coordinate systems.
I've got a small class that creates a hexagon using this code:
for (int i = 0; i < 6; ++i) {
    // one vertex every 60 degrees, offset by the hexagon's (x, y) grid position
    gl.glVertex2d(radius * Math.sin(i / 6.0 * 2 * Math.PI) - (radius * (x * 1.75)),
                  radius * Math.cos(i / 6.0 * 2 * Math.PI) + (radius * (y * 1.25)));
}
Note: all of this is in Java using the JOGL library.
When the hexagons are created, they use a coordinate system in which the center of the screen is (0,0) and the top left is (-1,1).
I want to convert these coordinates into screen coordinates, so that the top left is (0,0) and the center is (windowX/2, -windowY/2). So, if the window is 500 pixels by 500 pixels, the center would be (250, -250).
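In other words, the conversion I'm after would look something like the sketch below (ndcToScreen, width, and height are just illustrative names, not anything from my actual code):

// Hypothetical helper: maps a point from the (-1..1) coordinate system to the
// screen space described above, where the top left is (0,0) and the center is
// (width/2, -height/2).
static double[] ndcToScreen(double ndcX, double ndcY, double width, double height) {
    double screenX = (ndcX + 1.0) * width / 2.0;   // -1 -> 0,  0 -> width/2,   1 -> width
    double screenY = (ndcY - 1.0) * height / 2.0;  //  1 -> 0,  0 -> -height/2, -1 -> -height
    return new double[] { screenX, screenY };
}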
Upvotes: 1
Views: 1682
Reputation: 43359
Those coordinates (where <0,0> is the center and <-1,1> is the top left) are what are known as Normalized Device Coordinates. They are usually an intermediate coordinate space that vertices pass through after projection but before viewport mapping.
Now, the interesting thing about this coordinate space is that if you use an identity matrix for Projection and ModelView, you can use these coordinates exactly as they are. That is, you never need to know the dimensions of your viewport to draw a hexagon that fills it.
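For example, something along these lines (a sketch only; it assumes a JOGL GL2 context whose Projection and ModelView matrices have been left at identity, so the values passed to glVertex2d are interpreted directly as NDC):

// gl is the GL2 instance from your GLEventListener's display() callback.
void drawUnitHexagon(GL2 gl) {
    gl.glBegin(GL2.GL_LINE_LOOP);
    for (int i = 0; i < 6; ++i) {
        double angle = i / 6.0 * 2 * Math.PI;
        // vertices on the unit circle: the hexagon reaches the top (+1) and
        // bottom (-1) of NDC no matter what the window's pixel size is
        gl.glVertex2d(Math.sin(angle), Math.cos(angle));
    }
    gl.glEnd();
}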
Otherwise, you should use an orthographic projection matrix that uses the same dimensions as your window. glOrtho (...) is the traditional way of doing this:
glMatrixMode (GL_PROJECTION);
glLoadIdentity ();
glOrtho (0.0, 500.0, -500.0, 0.0, -1.0, 1.0); // left, right, bottom, top, near, far
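In JOGL (assuming the fixed-function GL2 profile that your vertex code suggests), that setup would look roughly like the sketch below, normally done in reshape() with the drawable's actual dimensions rather than hard-coded 500s:

gl.glMatrixMode(GL2.GL_PROJECTION);
gl.glLoadIdentity();
// left = 0, right = 500, bottom = -500, top = 0, near = -1, far = 1
gl.glOrtho(0.0, 500.0, -500.0, 0.0, -1.0, 1.0);
gl.glMatrixMode(GL2.GL_MODELVIEW);
gl.glLoadIdentity();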
I am curious, why do you want the Y coordinate to range from 0 to -500 in this example instead of 0 to 500?
Upvotes: 1