Mingebag

Reputation: 758

Ideas on how to measure the distance of a detected object from the camera using OpenCV for iOS?

Story:

I have a face-detection app built with OpenCV on the iPhone SDK 6.0. Everything is working: I can use the front camera and the back camera, I can turn the torch on and off, and I can display the current FPS (live).

But now I want to add an additional feature: showing how far away the detected face is.


Are there formulas for this, some good ideas, or even a solution from a pro? (If you need some code or anything else, I can show you everything you need.)

Thanks for the help and the fast answers!

Here is a translation into Objective-C for everyone:

-(double)faceDistance:(int)facePixelWidth {
    const double kFaceWidth = 10.0;  // assumed average face width in cm
    const double kPicWidth = 640.0;  // frame width in pixels
    // Calibration points: (distance in cm, visible surface width in cm)
    CGPoint p1 = CGPointMake(10, 10);
    CGPoint p2 = CGPointMake(100, 70);

    // Keep the slope in floating point; storing it in an int truncates
    // 60 / 90 to 0 and the final division blows up.
    double slope = (p2.y - p1.y) / (p2.x - p1.x);

    double propOfPic = kPicWidth / facePixelWidth;
    double surfSize = propOfPic * kFaceWidth;
    double dist = surfSize / slope;

    return dist;
}

I have updated my code; the above is the working solution in Objective-C.

Upvotes: 4

Views: 4209

Answers (1)

Kyle

Reputation: 893

I have done some work in image processing/AR, but I certainly would not consider myself an expert. That said, here is how I would approach this problem:

If you are figuring out how far away a face is, you could approximate it based on the average size of a face. What I mean is that human faces are all fairly similar in size (they don't deviate by more than a few cm in any direction). So if you do some testing, you can figure out how many pixels wide a face is at several distances and use trigonometry to approximate other distances. Obviously you couldn't calculate extremely accurate distances with this method, but you could differentiate between someone who is half a meter away and someone who is 3 meters away.

You would need to know the resolution of the camera you are using and likely do some testing with each possible resolution.
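One way to make this concrete is the similar-triangles (pinhole-camera) relation: if a face of real width W appears p pixels wide and the camera's focal length expressed in pixels is f, then distance ≈ W · f / p, and f can be estimated once from a single reference photo taken at a known distance. A minimal sketch of that calibration, using the 10 cm face width assumed below (the class name, method names, and the 280 px / 50 cm reference measurement are my own illustrative assumptions):

```java
public class FaceDistanceEstimator {
    // Assumed average face width in cm (same figure the answer uses below).
    static final double FACE_WIDTH_CM = 10.0;

    // One-time calibration: photograph a face at a known distance and
    // measure its pixel width, then f = pixelWidth * distance / realWidth.
    static double calibrateFocalPx(double pixelWidth, double knownDistanceCm) {
        return (pixelWidth * knownDistanceCm) / FACE_WIDTH_CM;
    }

    // Pinhole-camera estimate: distance = realWidth * f / pixelWidth.
    static double distanceCm(double focalPx, double facePixelWidth) {
        return FACE_WIDTH_CM * focalPx / facePixelWidth;
    }

    public static void main(String[] args) {
        // Suppose a face measured 280 px wide at a known 50 cm distance:
        double f = calibrateFocalPx(280, 50);  // 1400 px
        // The same face at half the pixel width is twice as far away:
        System.out.println(distanceCm(f, 140)); // prints 100.0
    }
}
```

Half the pixel width means double the distance, so the estimate degrades gracefully: small errors in the assumed face width scale the result linearly rather than breaking it.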

Update:

Ok, so I have been asked to elaborate a little more. The amount of space seen in a picture is linearly related to how far the surface is from the camera.

Assume that a human face is 10 cm wide

Ok, let's point the camera at a meter stick at a right angle. Hold the camera 10 cm away. How much does it see? Let's say that it sees 10 cm of the meter stick.

Now move the camera to 1 m away. Let's say that it now sees 70 cm. Using this information, we can generate the following Java:

public static final double FACE_WIDTH = 10;  // assumed face width in cm
public static final double PIC_WIDTH = 640;  // in pixels
// Calibration points: (distance in cm, visible surface width in cm)
static Point p1 = new Point(10, 10);
static Point p2 = new Point(100, 70);
// Keep the slope in floating point; int arithmetic truncates 60/90 to 0
static double slope = (double) (p2.y - p1.y) / (p2.x - p1.x);

public static double faceDistance(int facePixelWidth)
{
    double propOfPic = PIC_WIDTH / facePixelWidth;
    double surfSize = propOfPic * FACE_WIDTH;
    double dist = surfSize / slope;
    return dist;
}

This should work. I have not tested it, but mathematically it seems correct.
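For anyone who wants to run the numbers, here is a compilable, self-contained version of the snippet above (with `java.awt.Point` replaced by a plain constant to avoid the AWT dependency, and floating-point arithmetic throughout; with plain `int` arithmetic the slope 60/90 truncates to 0 and the final division blows up):

```java
public class FaceDistance {
    public static final double FACE_WIDTH = 10;  // assumed face width in cm
    public static final double PIC_WIDTH = 640;  // frame width in pixels

    // Calibration line through (10 cm, 10 cm) and (100 cm, 70 cm):
    // visible surface width grows about 0.67 cm per cm of distance.
    static final double SLOPE = (70.0 - 10.0) / (100.0 - 10.0);

    public static double faceDistance(int facePixelWidth) {
        double propOfPic = PIC_WIDTH / facePixelWidth; // frame px / face px
        double surfSize = propOfPic * FACE_WIDTH;      // visible width in cm
        return surfSize / SLOPE;                       // invert the line
    }

    public static void main(String[] args) {
        // A face spanning a tenth of the 640 px frame (64 px):
        System.out.println(faceDistance(64));  // approximately 150 (cm)
    }
}
```

Sanity check: a face at 640 px fills the whole frame and comes out at about 15 cm, while one at 64 px comes out ten times farther, which matches the linear model.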

Upvotes: 5
