Reputation: 1254
I'm using gluUnProject to convert screen coordinates to world coordinates, but unfortunately the value I get back is too small. It converts (0,0) to about (-0.80, 0.40), but it should convert to about (4, 2)... The code:
((GL11) gl).glGetIntegerv(GL11.GL_VIEWPORT, viewport, 0);                 // current viewport
((GL11) gl).glGetFloatv(GL11.GL_MODELVIEW_MATRIX, modelViewMatrix, 0);    // current model-view matrix
((GL11) gl).glGetFloatv(GL11.GL_PROJECTION_MATRIX, projectionMatrix, 0);  // current projection matrix
GLU.gluUnProject(main.x, main.y, 1, modelViewMatrix, 0, projectionMatrix, 0, viewport, 0, pointInPlane, 0); // winZ hard-coded to 1
xWcoord = pointInPlane[0];
yWcoord = pointInPlane[1];
zWcoord = pointInPlane[2];
main.x and main.y are screen coordinates. The other variables are defined as follows:
public static int[] viewport = new int[16];
public static float[] modelViewMatrix = new float[16];
public static float[] projectionMatrix = new float[16];
public static float[] pointInPlane = new float[16];
Upvotes: 2
Views: 1258
Reputation: 1174
gluUnProject requires the z depth of the point you wish to unproject from window coordinates to world coordinates. In your code, you're using '1' for the depth of the point, which is why it is giving you a result vector that has a length of approximately 1.
To get the depth of the particular pixel you want to unproject, you would ordinarily do something like this:
float z;
// read the depth of the single pixel at (main.x, main.y) from the depth buffer
glReadPixels( main.x, main.y, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &z );
If you then pass that 'z' value through to gluUnProject instead of '1', that should give you the correct world position.
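In the GL11/Java setup from your question, the same idea might look like the sketch below. This is only a sketch: OpenGL ES 1.x does not guarantee that glReadPixels can read the depth buffer, and the GL_DEPTH_COMPONENT constant (0x1902) is not exposed on the GL10/GL11 interfaces, so this only works on drivers that allow depth reads.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

// Sketch only: depth reads via glReadPixels are an optional capability on
// OpenGL ES 1.x, and 0x1902 (GL_DEPTH_COMPONENT) is not defined on GL10/GL11.
FloatBuffer depthBuffer = ByteBuffer.allocateDirect(4)
        .order(ByteOrder.nativeOrder())
        .asFloatBuffer();

// Read the depth of the single pixel at (main.x, main.y).
((GL11) gl).glReadPixels((int) main.x, (int) main.y, 1, 1,
        0x1902 /* GL_DEPTH_COMPONENT */, GL11.GL_FLOAT, depthBuffer);
float winZ = depthBuffer.get(0);

// Pass the read depth instead of the constant 1.
GLU.gluUnProject(main.x, main.y, winZ,
        modelViewMatrix, 0, projectionMatrix, 0, viewport, 0,
        pointInPlane, 0);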
Upvotes: 3