Reputation: 6488
I am trying to learn sensor fusion, and for that I have recorded raw data from the accelerometer, gyroscope, and magnetometer via an Android app.
I came across Kalman filters, but they are too complex to understand, and I do not want to just take some code and implement it without properly understanding it.
I then found this link for the complementary filter, and that looks very promising, as it is very easy to understand. So I have the following doubts. (This is the first time I am dealing with all these sensors, so I am going to ask all the questions I have.)
1. The complementary filter takes signals from the sensors and outputs orientation in terms of pitch, roll, and yaw. Does that mean it filters the time-domain signals and provides output in terms of angles? In that case, is it possible to obtain the filtered time-domain signal?
2. I came across this famous Google Tech Talk video, in which the presenter mentions that to obtain linear acceleration you need to subtract gravity from the raw accelerometer data. How do I obtain the gravity vector?
3. Also, I am slightly confused about why the acceleration signal has to be converted to the Earth coordinate system. I have read some documents, but I am still confused: I can see why it is done, but not how the required rotation matrix is calculated.
4. Last (but surely not my final question): how do I estimate the heading?
So basically, I have the sensor data, and I want to track the orientation of the device and the direction in which the person is heading. The questions may sound very basic, but I need some clarification from experts on this topic, so I can then go and work on some fancy algorithms.
I would really appreciate it if someone could point me in the right direction.
Best Regards
Chintan
Upvotes: 3
Views: 2924
Reputation: 124
From my experience, there is no AHRS algorithm that can compete with an extended Kalman filter in terms of accuracy. And accuracy is very important if you want to calculate the user acceleration, because any inaccuracy in your rotation matrix will result in a drift in your user acceleration.
To question 1: I don't understand exactly what you mean by filtered time-domain signals. The measurement samples always come with a timestamp.
Answer to 2 and 3:
To calculate the user acceleration, you need to calculate the attitude (rotation matrix) beforehand, because you need to rotate the incoming accelerometer data with the attitude calculated by your AHRS algorithm to get it from "phone space" to "world space". That way, an upward movement of the phone (no matter its orientation) will always result in an increased Y value in your user acceleration. I hope you get what I mean. Now we have the raw accelerometer data in world space, and we subtract the gravity vector ( vector3(0, 9.81f, 0) ), so that our new user acceleration always reads (0, 0, 0) when there is no movement.
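Here is a minimal sketch of that step in Java, assuming a 3x3 row-major rotation matrix R that maps phone space to world space and the Y-up world frame used above (all names are illustrative):

    // Rotate the raw accelerometer sample into world space and subtract
    // gravity. R is a 3x3 row-major rotation matrix (phone -> world).
    static float[] userAcceleration(float[] R, float[] accPhone) {
        float[] accWorld = new float[3];
        for (int row = 0; row < 3; row++) {          // accWorld = R * accPhone
            accWorld[row] = R[3 * row]     * accPhone[0]
                          + R[3 * row + 1] * accPhone[1]
                          + R[3 * row + 2] * accPhone[2];
        }
        accWorld[1] -= 9.81f;                        // remove gravity (Y-up convention)
        return accWorld;
    }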
This was the easy part. We now have the user acceleration in world space, but we want a positional offset (the path). You cannot just integrate acceleration to velocity and then velocity to the path, because the accelerometer samples are never accurate enough to integrate twice all the way to a position. You have to program constraints to control your derived velocity, so that it is set back to zero when both the value and the slope of the acceleration are zero. Otherwise there will always be a remaining amount of velocity, resulting in a huge drift of the calculated path over time.

I think that for good inside-out positional tracking you will need to do some analysis on the (world-space) user acceleration and reconstruct a clean velocity graph, to get smooth movements that always return to zero when there is no acceleration. I programmed this myself, and it works, but it is not exact. One problem is that the recognized movement depends on the velocity/acceleration: the slower the movements are, the lower the accelerometer values, until they get lost in the sensor noise. Another problem is recognizing when a movement has ended, in order to remove all its influence on the resulting velocity.
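A minimal sketch of such a constraint, assuming you already have the world-space user acceleration (the thresholds are illustrative and need tuning against real sensor noise):

    // Integrate world-space user acceleration to velocity, but reset the
    // velocity to zero when both the acceleration value and its slope are
    // near zero, so integration errors cannot accumulate into drift.
    class VelocityTracker {
        private static final float ACC_EPS   = 0.05f;  // m/s^2; illustrative threshold
        private static final float SLOPE_EPS = 0.5f;   // m/s^3; illustrative threshold
        private final float[] velocity = new float[3];
        private final float[] prevAcc  = new float[3];

        void step(float[] accWorld, float dt) {
            float mag = (float) Math.sqrt(accWorld[0] * accWorld[0]
                    + accWorld[1] * accWorld[1] + accWorld[2] * accWorld[2]);
            float slope = 0f;                           // rough numerical slope of the acceleration
            for (int i = 0; i < 3; i++) slope += Math.abs(accWorld[i] - prevAcc[i]) / dt;

            if (mag < ACC_EPS && slope < SLOPE_EPS) {
                velocity[0] = velocity[1] = velocity[2] = 0f;  // no movement: kill the drift
            } else {
                for (int i = 0; i < 3; i++) velocity[i] += accWorld[i] * dt;
            }
            System.arraycopy(accWorld, 0, prevAcc, 0, 3);
        }
    }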
The magnetometer is not needed for the AHRS algorithm, because it is not reliable enough and will always introduce errors: it can be affected too much by the environment. For example, look at the Google Cardboard magnetometer switch. Open a sensor test app, watch the magnetometer readings, and pull the Google Cardboard trigger: it produces a huge value on the magnetometer that does not represent the heading at all. The same can happen near microwaves etc. So to get a good north heading, you constantly have to check that the direction and magnitude of the magnetic field have not changed for a certain time and are reasonable values. Then you can use the magnetometer data as a reference for rotating the orientation rotation matrix you got from the AHRS algorithm, to correct the heading to north.
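A minimal sketch of such a plausibility check, with illustrative thresholds (Earth's field is roughly 25-65 microtesla; a real implementation would check stability over a time window, not just one sample):

    // Only trust the magnetometer for heading correction when the field
    // magnitude is plausible for Earth's field and the reading has been
    // stable since the last sample. Thresholds are illustrative.
    static boolean isMagnetometerUsable(float[] mag, float[] prevMag) {
        float norm = (float) Math.sqrt(mag[0] * mag[0] + mag[1] * mag[1] + mag[2] * mag[2]);
        if (norm < 25f || norm > 65f) return false;  // outside ~25-65 microtesla: disturbed
        float change = 0f;
        for (int i = 0; i < 3; i++) change += Math.abs(mag[i] - prevMag[i]);
        return change < 5f;                          // field has not jumped since last sample
    }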
Answer to 4: You get the heading from your rotation matrix:
    // Third row (indices 8-10) of the 4x4 rotation matrix, i.e. the forward axis.
    float[] headingDirection = { rotMat[8], rotMat[9], rotMat[10] };
Depending on the form of your rotation matrix (column-major or row-major), you may need to adjust the indices. See the answer from John Schultz here: http://www.gamedev.net/topic/319213-direction-vector-from-rotation-matrix/
The rotation matrix should be estimated by integrating the current rotation speed (from the gyroscope), multiplied by the elapsed time between your last estimated rotation and now.
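A minimal sketch of that integration step, assuming a 3x3 row-major orientation matrix and gyroscope rates in rad/s (Rodrigues' formula turns omega * dt into an incremental rotation; all names are illustrative):

    // Compose the previous orientation R with the incremental rotation
    // built from angular velocity omega (rad/s) over the time step dt.
    static float[] integrateGyro(float[] R, float[] omega, float dt) {
        float angle = (float) Math.sqrt(omega[0] * omega[0]
                + omega[1] * omega[1] + omega[2] * omega[2]) * dt;
        if (angle < 1e-9f) return R;                 // no measurable rotation
        float x = omega[0] * dt / angle;             // unit rotation axis
        float y = omega[1] * dt / angle;
        float z = omega[2] * dt / angle;
        float c = (float) Math.cos(angle), s = (float) Math.sin(angle), t = 1f - c;
        float[] dR = {                               // Rodrigues' rotation matrix
            t * x * x + c,     t * x * y - s * z, t * x * z + s * y,
            t * x * y + s * z, t * y * y + c,     t * y * z - s * x,
            t * x * z - s * y, t * y * z + s * x, t * z * z + c
        };
        float[] out = new float[9];                  // out = R * dR (row-major)
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    out[3 * i + j] += R[3 * i + k] * dR[3 * k + j];
        return out;
    }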
Annotation:
I think if you want to play around with sensor fusion and user acceleration, the best starting point may be the extended Kalman filter from the cardboard.jar. You can compare it to your own algorithm.
Although it has a method for using the magnetometer (processMag), this method never gets called in the Cardboard API.
The method "getPredictedGLMatrix" in the linked file shows how Google estimates the "current" rotation matrix.
I hope this answers some of your questions.
Upvotes: 2