Reputation: 7216
We are currently working on an app that displays objects superimposed on the camera view (basically augmented reality), but we are finding that the objects don't appear "real". The reason is that our objects do not change size with distance the way objects in real life do.
For example, say we place an object in augmented reality at lat = 43, long = -70. If we walk toward that point with our phones, we should see the object grow as we get closer, but right now that growth is linear. If we walked toward the object in real life, however, we would see it grow according to some specific function.
Any ideas as to what that function might be?
Thank you.
Upvotes: 2
Views: 314
Reputation: 7216
I believe I've found the answer here: http://en.wikipedia.org/wiki/Visual_angle. Looks like the proper formula to use is tan(2 * arctan(realSize / (2 * distance))) * screenSize.
Someone please correct this if I made a mistake with my math.
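For anyone who wants to try it, here's a minimal Python sketch of that formula (realSize, distance, and screenSize are just the placeholder names from the formula above, not any real API):

import math

def apparent_size(real_size, distance, screen_size):
    # Visual angle subtended by the object: V = 2 * arctan(realSize / (2 * distance))
    visual_angle = 2 * math.atan(real_size / (2 * distance))
    # Scale tan(V) by a screen-size constant to get an on-screen size
    return math.tan(visual_angle) * screen_size

# Walking toward a 2 m object, with 1080 as the screen constant:
for d in (20.0, 10.0, 5.0):
    print(d, round(apparent_size(2.0, d, 1080.0), 1))

Note that for small angles tan(2 * arctan(x)) ≈ 2x, so this reduces to roughly screenSize * realSize / distance, which agrees with the 1/distance model in the answer below.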
Upvotes: 1
Reputation: 1575
It should be linear relative to a certain point behind the camera.
Imagine a 3D world where you project an image onto a plane (i.e., take a picture). If you leave it at that, objects won't change size when your plane (the camera) moves. That would be an orthographic projection (http://en.wikipedia.org/wiki/Orthographic_projection).
A real-world camera works like a perspective projection (http://en.wikipedia.org/wiki/Perspective_(graphical)), except that the projection point is ahead of the camera and the image gets inverted on the plane. This means the size of an object in your image follows a function like this:
realSize * (constantDistanceBetweenPlaneAndPoint / distanceFromPoint)
I don't know enough about cameras to tell you whether this constant is documented, but for a given zoom factor it should be fixed. It's just a matter of determining it experimentally.
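A minimal Python sketch of that model (focal_constant stands in for constantDistanceBetweenPlaneAndPoint and would be calibrated experimentally, as suggested above):

def projected_size(real_size, distance_from_point, focal_constant):
    # Pinhole/perspective model: on-image size falls off as 1/distance
    return real_size * (focal_constant / distance_from_point)

# Hypothetical calibration: measure one object of known size once on screen,
# which pins down the constant for that zoom level.
known_real, known_dist, measured_px = 2.0, 10.0, 140.0
focal_constant = measured_px * known_dist / known_real   # 700.0
print(projected_size(2.0, 5.0, focal_constant))  # 280.0: twice as big at half the distance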
Upvotes: 1