PolGraphic

Reputation: 3364

Wrong aspect ratio calculations for camera (simple ray-caster)

I am working on a really simple ray tracer.

For now I am trying to make the perspective camera work properly.

I use the following loop to render the scene (with just two hard-coded spheres; I cast a ray through the center of each pixel, with no AA applied):

Camera * camera = new PerspectiveCamera({ 0.0f, 0.0f, 0.0f }/*pos*/, 
{ 0.0f, 0.0f, 1.0f }/*direction*/, { 0.0f, 1.0f, 0.0f }/*up*/,
buffer->getSize() /*projectionPlaneSize*/);
Sphere * sphere1 = new Sphere({ 300.0f, 50.0f, 1000.0f }, 100.0f); //center, radius
Sphere * sphere2 = new Sphere({ 100.0f, 50.0f, 1000.0f }, 50.0f);

for(int i = 0; i < buffer->getSize().getX(); i++) {
    for(int j = 0; j < buffer->getSize().getY(); j++) {
//for each pixel of buffer (image)
        double centerX = i + 0.5;
        double centerY = j + 0.5;

        Geometries::Ray ray = camera->generateRay(centerX, centerY);
        Collision * collision = ray.testCollision(sphere1, sphere2);
        if(collision){
            //output red
        }else{
            //output blue
        }
    }
}

The Camera::generateRay(float x, float y) is:

Geometries::Ray Camera::generateRay(float x, float y) {
    //position = camera position, direction = camera direction etc.
    Point2D xy = fromImageToPlaneSpace({ x, y });
    Vector3D imagePoint = right * xy.getX() + up * xy.getY() + position + direction;
    Vector3D rayDirection = imagePoint - position;
    rayDirection.normalizeIt();
    return Geometries::Ray(position, rayDirection);
}

Point2D fromImageToPlaneSpace(Point2D uv) {
    float width = projectionPlaneSize.getX();
    float height = projectionPlaneSize.getY();
    float x = ((2 * uv.getX() - width) / width) * tan(fovX);
    float y = ((2 * uv.getY() - height) / height) * tan(fovY);
    return Point2D(x, y);
}

The fovs:

double fovX = 3.14159265359 / 4.0;
double fovY = projectionPlaneSize.getY() / projectionPlaneSize.getX() * fovX;

I get a good result for a 1:1 width:height aspect ratio (e.g. 400x400).

But I get errors for e.g. 800x400: the spheres render as ovals.

The distortion gets even slightly worse for bigger aspect ratios (like 1200x400).

What did I do wrong or which step did I omit?

Can it be a problem with precision or rather something with fromImageToPlaneSpace(...)?

Upvotes: 1

Views: 544

Answers (2)

legalize

Reputation: 2251

From the images, it looks like you have incorrectly defined the mapping from pixel coordinates to world coordinates and are introducing some stretch in the Y axis.

Skimming your code, it looks like you are defining the camera's view frustum from the dimensions of the frame buffer. Therefore, if you have a non-1:1 aspect ratio frame buffer, you have a camera whose view frustum is not 1:1 either. You will want to separate the model of the camera's view frustum from the image-space dimensions of the final frame buffer.

In other words, the frame buffer is the portion of the plane projected by the camera that we are viewing. The camera defines how the 3D space of the world is projected onto the camera plane.
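One way to sketch that separation (assuming `fovX` is the camera's horizontal half-angle; the helper name is illustrative, not part of the asker's API): the frustum is defined by a single camera parameter, and the vertical half-angle is *derived* from the buffer proportions instead of being defined independently.

```cpp
#include <cmath>

// Illustrative helper: keep fovX as the one camera parameter and derive
// the vertical half-angle so that the projected plane and the frame
// buffer have the same proportions, i.e. tan(fovY)/tan(fovX) == h/w.
double verticalHalfAngle(double fovX, double width, double height) {
    return std::atan(std::tan(fovX) * height / width);
}
```

Note that the constraint lives on the tangents, not the angles themselves; scaling the angle directly only coincides with this for a 1:1 buffer.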

Any basic book on 3D graphics will discuss viewing and projection.

Upvotes: 1

Craig Estey

Reputation: 33631

Caveat: I spent 5 years at a video company, but I'm a little rusty.

Note: after writing this, I realized that pixel aspect ratio may not be your problem as the screen aspect ratio also appears to be wrong, so you can skip down a bit.

But, in video we were concerned with two different video sources: standard definition with a screen aspect ratio of 4:3 and high definition with a screen aspect ratio of 16:9.

But, there's also another variable/parameter: pixel aspect ratio. In standard definition, pixels are square and in hidef pixels are rectangular (or vice-versa--I can't remember).

Assuming your current calculations are correct for screen ratio, you may have to account for the pixel aspect ratio being different, either from camera source or the display you're using.

Both the screen aspect ratio and the pixel aspect ratio can be stored in a .mp4, .jpeg, etc.

I downloaded your 1200x400 jpeg. I used ImageMagick on it to change only the pixel aspect ratio:

convert orig.jpg -resize 125x100%\! new.jpg

This says: change the pixel aspect ratio (increase the width to 125% and leave the height the same). The \! tells ImageMagick to apply the geometry exactly instead of preserving the existing aspect ratio. The 125 is because I remember the rectangular pixel as 8x10. Anyway, you need to increase the horizontal width by 10/8, which is 1.25 or 125%.

Needless to say this gave me circles instead of ovals.

Actually, I was able to get the same effect with adjusting the screen aspect ratio.

So, somewhere in your calculations, you're introducing a distortion of that factor. Where are you applying the scaling? How are the function calls different?

Where do you set the screen size/ratio? I don't think that's shown (e.g. I don't see anything like 1200 or 400 anywhere).

If I had to hazard a guess, you must account for aspect ratio in fromImageToPlaneSpace. Either width/height needs to be prescaled or the x = and/or y = lines need scaling factors. AFAICT, what you've got will only work for square geometry at present. To test, using the 1200x400 case, multiply the x by 125% [a kludge] and I bet you get something.
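A minimal sketch of what that scaling could look like (a hypothetical standalone version of `fromImageToPlaneSpace`, with `width`, `height`, and `fovX` passed in explicitly rather than read from members): scale the *tangent* by the aspect ratio rather than scaling the angle, since tan is nonlinear and tan((h/w)*fovX) != (h/w)*tan(fovX) in general.

```cpp
#include <cmath>

struct Point2D { float x; float y; };

// Hypothetical corrected mapping: derive the vertical extent of the
// projection plane from the horizontal one by scaling the tangent, so
// that tanY/tanX equals height/width exactly.
Point2D fromImageToPlaneSpace(Point2D uv, float width, float height, float fovX) {
    float tanX = std::tan(fovX);
    float tanY = tanX * height / width; // plane proportions == buffer proportions
    float x = ((2 * uv.x - width) / width) * tanX;
    float y = ((2 * uv.y - height) / height) * tanY;
    return {x, y};
}
```

For the 1200x400 case with fovX = pi/4 this gives tanY = 1/3 ~ 0.333, whereas the posted code computes tan((400/1200)*pi/4) = tan(pi/12) ~ 0.268; the ratio of the two is ~1.24, which is suspiciously close to the 125% stretch observed above.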

Upvotes: 1
