Reputation: 544
I would like to compare my poses obtained from a webcam to that of a pose obtained from an image. The base code for the pose estimation is from: https://github.com/opencv/opencv/blob/master/samples/dnn/openpose.py
How can I compare my own poses live-time with an image's pose, and return True if the two poses match within some threshold?
For instance, if I put my arms in a certain position to match an image of someone doing the same, how could I get a result of how close the match is?
What would be a way of doing this / where could I find more information on this?
Upvotes: 1
Views: 852
Reputation: 1249
As you can see here, the detected human pose consists of joints indexed from 0 to 17.
You can use the L2 distance to measure the distance between each pair of corresponding joints.
E.g., the squared difference in x for the 0-th joint:
(J0[0] - J1[0])*(J0[0] - J1[0])
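A minimal NumPy sketch of this per-joint L2 comparison (the poses and values below are made up for illustration; real poses from the OpenCV sample would have 18 (x, y) keypoints):

```python
import numpy as np

# Hypothetical poses: (x, y) per joint, following the 0-17 joint
# indexing of the OpenCV OpenPose sample. Only two joints shown.
pose_a = np.array([[10.0, 20.0], [12.0, 25.0]])
pose_b = np.array([[11.0, 21.0], [14.0, 24.0]])

# Squared L2 distance per joint: (xa - xb)^2 + (ya - yb)^2
sq_dists = np.sum((pose_a - pose_b) ** 2, axis=1)

# Total distance over all joints; compare this to a chosen threshold
# to decide whether the two poses "match".
total = np.sqrt(sq_dists).sum()
```

In practice you would also want to normalize the keypoints (e.g., for translation and scale) before comparing, since the two people will not occupy the same pixels.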
More about the output of OpenPose: it actually gives you not only the (x, y) coordinates of each joint
but also a confidence
score from 0 to 1. You can involve this score in the comparison.
For example, in my project:
(J0[0] - J1[0])*(J0[0] - J1[0])*confidence
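A sketch of the confidence-weighted variant, assuming each keypoint is an (x, y, confidence) triple and weighting each joint by the product of the two detections' confidences (all values here are invented; the threshold would need to be tuned empirically):

```python
import numpy as np

# Hypothetical poses: (x, y, confidence) per joint, where confidence
# is the 0-1 score OpenPose reports for each detected joint.
pose_a = np.array([[10.0, 20.0, 0.9], [12.0, 25.0, 0.4]])
pose_b = np.array([[11.0, 21.0, 0.8], [14.0, 24.0, 0.5]])

xy_a, conf_a = pose_a[:, :2], pose_a[:, 2]
xy_b, conf_b = pose_b[:, :2], pose_b[:, 2]

# Weight each joint's squared L2 distance by the product of the two
# confidences, so low-confidence detections contribute less.
weights = conf_a * conf_b
score = np.sum(weights * np.sum((xy_a - xy_b) ** 2, axis=1))

# Return True when the weighted distance is below a chosen threshold
# (the value 5.0 is an arbitrary placeholder).
THRESHOLD = 5.0
match = bool(score < THRESHOLD)
```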
Upvotes: 1