Reputation: 310
Essentially, there is some strange warping of the 3D cube rendered by my raytracer, and it gets worse as the camera moves up, even though the cube stays in the same place on the screen.
The code is at http://pastebin.com/HucgjRtx
Here is a picture of the output:
http://postimg.org/image/5rnfrlkej/
EDIT: Problem resolved; I was simply calculating the angles for the vectors wrong. The best method I have found is to create a vector from your FOV distance (Z), the current pixel X, and the current pixel Y, then normalize that vector.
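For reference, a minimal sketch of that approach (the names here are illustrative, not from my actual code; it assumes a simple Vector3 class with a normalize() method, like the one in the answer below):

// Build one ray direction per pixel from pixel offsets and a fixed plane depth.
// 'halfWidth'/'halfHeight' are half the image size in pixels, and 'fovDistance'
// is the image-plane depth derived from the FOV, e.g. halfWidth / tan(fov / 2).
for (int py = 0; py < height; py++) {
    for (int px = 0; px < width; px++) {
        double x = px - halfWidth;   // horizontal offset from the image center
        double y = py - halfHeight;  // vertical offset from the image center
        double z = fovDistance;      // depth of the image plane (from the FOV)
        Vector3 rayDir = new Vector3(x, y, z).normalize();
        // cast a ray from the camera position along rayDir ...
    }
}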
Upvotes: 0
Views: 137
Reputation: 6570
It looks like you're calculating rays to cast based on Euler angles instead of the usual projection.
Typically a "3D" camera is modeled such that the camera is at a point with rays projecting through a grid spaced some distance from it... which is, incidentally, exactly like looking at a monitor placed some distance from your face and projecting a ray through each pixel of the monitor.
The calculations are conceptually simple in the fixed case, e.g.:
double pixelSpacing = 0.005;
double screenDistance = 0.7;
for (int yIndex = -100; yIndex <= 100; yIndex++) {
    for (int xIndex = -100; xIndex <= 100; xIndex++) {
        Vector3 ray = new Vector3(
            xIndex * pixelSpacing,
            yIndex * pixelSpacing,
            screenDistance
        );
        ray = ray.normalize();
        // And 'ray' is now a vector with our ray direction
    }
}
You can use one of the usual techniques (e.g. 4x4 matrix multiplication) if you want to rotate this field of view.
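As a rough sketch (assuming the same illustrative Vector3 class with public x/y/z fields, and a yaw angle 'theta' for the camera), rotating each ray about the Y axis with the standard rotation matrix looks like this; extend it to a full 4x4 matrix for arbitrary orientations:

// Rotate the per-pixel ray direction by the camera's yaw before casting.
double cosT = Math.cos(theta);
double sinT = Math.sin(theta);
Vector3 rotated = new Vector3(
    ray.x * cosT + ray.z * sinT,   // x' = x*cos(theta) + z*sin(theta)
    ray.y,                         // y is unchanged by a rotation about Y
    -ray.x * sinT + ray.z * cosT   // z' = -x*sin(theta) + z*cos(theta)
);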
Upvotes: 5