Silex

Reputation: 2713

Distance between the camera and a recognized "object"

I would like to calculate the distance between my camera and a recognized "object". The recognized "object" is a black rectangle sticker on a white board for example. I know the values of the rectangle (x,y).

Is there a method that I can use to calculate the distance with the values of my original rectangle, and the values of the picture of the rectangle I took with the camera?

I searched the forum for answers, but none of them addressed calculating the distance from these values.

I am working on a robot called Nao from Aldebaran Robotics, and I am planning to use OpenCV to recognize the black rectangle.

Upvotes: 4

Views: 2965

Answers (3)

Dave Cote

Reputation: 1

I have been working on image/object recognition as well. I just released a Python-programmed app (ported to Android) that recognizes objects, people, cars, books, logos, trees, flowers... anything :) It also shows its thought process as it "thinks" :) I've put it out as a test for 99 cents on Google Play. Here's the link if you're interested; there's also a video of it in action: https://play.google.com/store/apps/details?id=com.davecote.androideyes

Enjoy! :)

Upvotes: -2

Francesco Callari

Reputation: 11825

It is a big topic. If you want to proceed from a single image, take a look at this old paper by A. Criminisi. For an in-depth view, read his Ph.D. thesis. Then start playing with the OpenCV routines in the "projective geometry" section.
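As a concrete entry point into those routines, here is a minimal sketch using OpenCV's cv2.solvePnP to recover the distance to a rectangle of known size. It assumes you already have calibrated camera intrinsics and have detected the four corners of the rectangle in the image; the rectangle dimensions, corner coordinates, and camera matrix below are all placeholder values.

```python
# Minimal sketch: distance to a rectangle of known size via cv2.solvePnP.
# Assumes calibrated intrinsics and detected corners; all values below
# are placeholders.
import numpy as np
import cv2

# 3D corners of the rectangle in its own frame (metres), ordered the
# same way as the detected image corners.
object_points = np.array([
    [0.0,  0.0,  0.0],
    [0.10, 0.0,  0.0],
    [0.10, 0.05, 0.0],
    [0.0,  0.05, 0.0],
], dtype=np.float32)

# 2D pixel coordinates of the same corners, e.g. from cv2.findContours
# followed by cv2.approxPolyDP (placeholder values).
image_points = np.array([
    [320.0, 240.0],
    [420.0, 242.0],
    [418.0, 292.0],
    [322.0, 290.0],
], dtype=np.float32)

camera_matrix = np.array([[800.0,   0.0, 320.0],
                          [  0.0, 800.0, 240.0],
                          [  0.0,   0.0,   1.0]])  # assumed intrinsics
dist_coeffs = np.zeros(5)                          # assume no lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
if ok:
    # tvec is the rectangle's origin expressed in camera coordinates;
    # its norm is the camera-to-rectangle distance.
    print("distance: %.3f m" % np.linalg.norm(tvec))
```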

Upvotes: 1

Rich

Reputation: 15464

If you could compute the angle taken up by the image of the target, then the distance to the target should be proportional to the cotangent (i.e. 1/tan) of half that angle. You should find that the number of pixels in the image corresponds roughly to that angle, but I doubt the relationship is completely linear, especially up close.
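To make that relation concrete, here is a small sketch for an idealized pinhole camera; the focal length in pixels and the measured pixel width are placeholder values, and real_width is the known physical width of the rectangle.

```python
# Sketch of the half-angle/cotangent relation for an ideal pinhole camera.
import math

f_px = 800.0        # assumed focal length in pixels
w_px = 100.0        # measured width of the rectangle in the image
real_width = 0.10   # known physical width of the rectangle in metres

# Angle subtended by the target at the camera centre.
angle = 2.0 * math.atan(w_px / (2.0 * f_px))

# Distance is (half the real width) times cot of half that angle,
# which for a pinhole camera reduces to real_width * f_px / w_px.
distance = (real_width / 2.0) / math.tan(angle / 2.0)
print("distance: %.3f m" % distance)   # == real_width * f_px / w_px
```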

The behaviour of your camera lens is likely to affect this measurement, so it will depend on your exact setup.

Why not measure the size of the target at several distances, and plot a scatter graph? You could then fit a curve to the data to get a size->distance function for your particular system. If your camera is close to an "ideal" camera, then you should find this graph looks like cot, and you should be able to find your values of a and b to match dist = a * cot (b * width).
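For illustration, a sketch of that curve fit using SciPy is below; the (pixel width, distance) pairs are placeholder measurements you would collect yourself with your own camera.

```python
# Sketch: fit dist = a * cot(b * width) to measured (width, distance) data.
import numpy as np
from scipy.optimize import curve_fit

def model(width_px, a, b):
    # dist = a * cot(b * width)  ==  a / tan(b * width)
    return a / np.tan(b * width_px)

widths_px = np.array([200.0, 150.0, 100.0, 75.0, 50.0])   # measured widths
distances_m = np.array([0.40, 0.53, 0.80, 1.07, 1.60])    # measured distances

# p0 is a rough initial guess; b must keep b*width inside (0, pi/2).
params, _ = curve_fit(model, widths_px, distances_m, p0=[0.1, 1e-3])
a, b = params
print("dist(w) = %.4f * cot(%.6f * w)" % (a, b))
print("predicted at w=120 px: %.3f m" % model(120.0, a, b))
```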

If you try this experiment, why not post the answers here, for others to benefit from?

[Edit: a note about 'ideal' cameras]

For a camera image to look 'realistic' to us, the image should approximate a projection onto a plane held in front of the eye (because we view camera images by holding a planar image in front of our eyes). Imagine holding a sheet of tracing paper up in front of your eye and sketching the object's silhouette on that paper. The second diagram on this page shows roughly what I mean. You might describe a camera which achieves this as an "ideal" camera.

Of course, in real life, cameras don't work via tracing paper, but with lenses. Very complicated lenses. Have a look at the lens diagram on this page. For various reasons which you could spend a lifetime studying, it is very tricky to create a lens which works exactly like the tracing paper example would work under all conditions. Start with this wiki page and read on if you want to know more.

So you are unlikely to be able to compute an exact relationship between pixel length and distance: you should measure it and fit a curve.

Upvotes: 2
