Motion capture with a Kinect v1 in Processing
Hello there, I was wondering if anyone could help me with something.
I have recently been given a task to do by my teachers at college, and the way I hope to achieve it is through motion capture.
The other lecturers teach sound art and film art, so I plan to create a program that tracks the participant's movements and displays them on screen in either set or random colours.
I would also like to tie the sound part of this project to the participant's movements, either by changing the pitch of a noise through movement or by changing the speed of the sound through movement.
I have managed to get an Xbox 360 Kinect (model 1414) to work in Processing and have played around with the motion tracking, but I can't seem to figure out how to attach an ellipse to the hands. I hope someone can help me and that it doesn't turn out to be too much of a hellish task.
If you can help, here is my email address: ([email protected])
(If this is impossible I would understand, as I tend to make life difficult for myself, haha.)
Answers (1)
You will need a middleware library that can provide skeleton tracking data from depth data.
One option on Windows is the Kinect for Windows Processing library which uses the Kinect SDK.
There is another library called SimpleOpenNI which works on multiple operating systems.
The official version is no longer updated for Processing 3 (it does work with Processing 2.2.1, though). Fortunately you can find an updated fork of the SimpleOpenNI library on GitHub.
To manually install the library:
- Select the version of the library for your version of Processing (e.g. for Processing 3.5.3 go to SimpleOpenni Processing_3.5.3). It should be one of 3.5.3, 3.5.2, 3.4, 3.3.7, 3.3.6, or 2.2.1 (otherwise you may need to install one of these Processing versions)
- Click Clone or download > Download ZIP (on the top right side of the repo)
- Unzip the contents, and within the extracted folder select the SimpleOpenNI folder that contains a folder named library
- Move this nested SimpleOpenNI folder (containing the library folder) to Documents/Processing/libraries
- Restart Processing (if it was already running)
- Go to Processing > Examples > Contributed Libraries > SimpleOpenNI > OpenNI and start playing with the examples
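Once the library shows up under Contributed Libraries, a quick way to confirm the install and the Kinect connection is a minimal depth-view sketch. This is an untested sketch along the lines of the bundled DepthImage example, assuming the SimpleOpenNI fork installed above:

```processing
// Minimal check that SimpleOpenNI and the Kinect are working:
// shows the live depth image from the sensor.
import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth(); // enable the depth map stream
}

void draw() {
  context.update();                     // fetch new sensor data
  image(context.depthImage(), 0, 0);    // draw the depth map
}
```

If you see a greyscale depth image of the room, the library and sensor are set up correctly and you can move on to skeleton tracking.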
Other notes:
- To track a user start with the User and User3d examples
- notice that context.getCoM() returns the centre of mass (a single point), while context.getJointPositionSkeleton() can get you the position of a hand in 3D
- you can use context.convertRealWorldToProjective() to convert a 3D position to a projected 2D position on screen
- Once the skeleton tracking is locked onto a person you can get the joint position for each hand, but it's worth noting there is also separate hand-tracker functionality: check out the Hands / Hands3d examples. Depending on how you want to track participants, what the environment is, and what the motions are, choose the option that works best
- Speaking of the environment, bear in mind the Xbox 360 Kinect is susceptible to infrared light interference (for example bright incandescent lights, direct sunlight, etc.): this will deteriorate the depth map quality, which in turn affects skeleton tracking. You will want as much control over the lighting as possible, to get as close to ideal conditions as you can
- Test! Test! Test! :) Think through the interaction and the environment (sketching on paper first can be useful), and for each assumption run a basic test to prove whether it works. Use iterations to learn how to change either the environment or the interaction to make it work
- Check out the RecorderPlay example: it records a .oni file, which contains both RGB and depth data. This is super useful because it allows you to record on site in areas where your access time might be limited, and it saves you from going back and forth between your computer and standing in front of the Kinect. Once you initialise SimpleOpenNI with the path to the .oni file (e.g. context = new SimpleOpenNI(this, recordPath);) you can run the skeleton tracking and everything else using the recording
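To answer the original question directly, the pieces above combine into a sketch that draws an ellipse on each hand: enable user tracking, start tracking the skeleton when a user appears, read each hand joint with getJointPositionSkeleton(), and convert it to screen space with convertRealWorldToProjective(). This is an untested sketch based on the bundled User example, assuming the SimpleOpenNI fork above (the 30-pixel ellipse size and the 0.5 confidence threshold are arbitrary choices for illustration):

```processing
// Track users and draw an ellipse on each hand.
import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth();
  context.enableUser(); // enables user detection + skeleton tracking
}

void draw() {
  context.update();
  image(context.depthImage(), 0, 0);

  int[] users = context.getUsers();
  for (int i = 0; i < users.length; i++) {
    int userId = users[i];
    if (context.isTrackingSkeleton(userId)) {
      drawHand(userId, SimpleOpenNI.SKEL_LEFT_HAND);
      drawHand(userId, SimpleOpenNI.SKEL_RIGHT_HAND);
    }
  }
}

void drawHand(int userId, int jointId) {
  PVector jointReal   = new PVector(); // 3D position in millimetres
  PVector jointScreen = new PVector(); // projected 2D position in pixels
  float confidence = context.getJointPositionSkeleton(userId, jointId, jointReal);
  if (confidence > 0.5) { // only draw joints the tracker is confident about
    context.convertRealWorldToProjective(jointReal, jointScreen);
    noStroke();
    fill(random(255), random(255), random(255)); // random colours, as in the brief
    ellipse(jointScreen.x, jointScreen.y, 30, 30);
  }
}

// SimpleOpenNI callback: begin skeleton tracking when a new user is detected
void onNewUser(SimpleOpenNI curContext, int userId) {
  curContext.startTrackingSkeleton(userId);
}
```

The same jointScreen (or the raw 3D jointReal) values could later drive your sound parameters, for example by mapping hand height to pitch with Processing's map() function.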
If you want to learn more about Kinect and Processing, check out Daniel Shiffman's Getting Started with Kinect and Processing page.
Have fun!