kumarchandresh

Reputation: 590

How does Android calculate Rotation Vector?

The Android documentation says here (just read the top 4-5 lines) that the Rotation Vector Sensor is software-based. But most Android devices actually have three hardware sensors: an accelerometer, a gyroscope, and a magnetometer. So there must be a conversion algorithm that fuses data from these physical sensors into the virtual rotation vector sensor. However, I was unable to find any article or source code showing the algorithm used to calculate the rotation vector. If someone has experience in this area, maybe they can point me in the right direction.

I need to know whether it is necessary to get the rotation data from the rotation vector sensor, or whether I can compute it myself from the hardware sensors.

Upvotes: 3

Views: 1773

Answers (1)

cantunca

Reputation: 446

The implementation accessible through the following URL uses an Extended Kalman Filter, which is a pretty standard way of fusing accelerometer, gyroscope, and (optionally) magnetometer data:

https://android.googlesource.com/platform/frameworks/native/+/refs/heads/master/services/sensorservice/Fusion.cpp

The key methods are predict and update, which correspond to the respective stages of the Kalman Filter. The output is a quaternion (retrievable via the getAttitude method), which is a convenient way to represent 3D orientations. The term "Rotation Vector" is essentially Android's name for the vector part of that unit quaternion.
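To make the quaternion connection concrete: the rotation vector event delivers the components (x·sin(θ/2), y·sin(θ/2), z·sin(θ/2)) of a unit quaternion, so the scalar part w = cos(θ/2) can be recovered from the unit-norm constraint. Here is a minimal plain-Java sketch of that reconstruction (the same idea SensorManager.getQuaternionFromVector applies for 3-element values; RotationVectorDemo is just an illustrative class name, not an Android API):

```java
// Sketch: recovering the full quaternion from a 3-component rotation vector
// (x*sin(theta/2), y*sin(theta/2), z*sin(theta/2)), as reported by the sensor.
public class RotationVectorDemo {
    static float[] quaternionFromRotationVector(float[] rv) {
        float x = rv[0], y = rv[1], z = rv[2];
        // Unit quaternion: w^2 + x^2 + y^2 + z^2 = 1, so w = sqrt(1 - |rv|^2).
        float w = 1f - x * x - y * y - z * z;
        w = w > 0f ? (float) Math.sqrt(w) : 0f;
        return new float[] { w, x, y, z };
    }

    public static void main(String[] args) {
        // A 90-degree rotation about the z axis: sin(45 deg) ~ 0.7071 on z.
        float[] q = quaternionFromRotationVector(new float[] { 0f, 0f, 0.70710678f });
        System.out.printf("w=%.4f x=%.4f y=%.4f z=%.4f%n", q[0], q[1], q[2], q[3]);
        // prints "w=0.7071 x=0.0000 y=0.0000 z=0.7071"
    }
}
```

On Android itself you would just call SensorManager.getQuaternionFromVector (or getRotationMatrixFromVector) with the event values rather than doing this by hand.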

Other alternatives for sensor fusion include Madgwick and Mahony filters. There are plenty of resources out there (and even many open implementations).
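If you want to compute orientation yourself, you do not need a full EKF to get started. A complementary filter captures the core fusion idea in a few lines: integrate the gyroscope for short-term accuracy and pull the estimate toward the accelerometer's tilt reading to cancel drift. The sketch below tracks a single pitch angle (not Android's implementation; alpha and the class name are illustrative choices):

```java
// Minimal complementary-filter sketch: blends integrated gyro rate with an
// accelerometer-derived tilt estimate. Tracks pitch only, for simplicity.
public class ComplementaryFilter {
    final double alpha;   // weight given to the gyro path, typically ~0.98
    double pitch;         // current pitch estimate, radians

    ComplementaryFilter(double alpha) { this.alpha = alpha; }

    // gyroRate: pitch rate from the gyroscope (rad/s)
    // ax, ay, az: accelerometer reading (any consistent units)
    // dt: time step (s)
    double update(double gyroRate, double ax, double ay, double az, double dt) {
        // Tilt implied by gravity direction in the accelerometer frame.
        double accelPitch = Math.atan2(-ax, Math.sqrt(ay * ay + az * az));
        // Mostly trust the integrated gyro; slowly correct toward the accel.
        pitch = alpha * (pitch + gyroRate * dt) + (1 - alpha) * accelPitch;
        return pitch;
    }

    public static void main(String[] args) {
        ComplementaryFilter f = new ComplementaryFilter(0.98);
        f.pitch = 0.5; // pretend the gyro has drifted by 0.5 rad
        // Device held still and level: gyro reads 0, gravity along +z.
        for (int i = 0; i < 500; i++) f.update(0, 0, 0, 9.81, 0.01);
        System.out.printf("pitch=%.3f%n", f.pitch);
        // prints "pitch=0.000" -- the accelerometer has pulled the drift out
    }
}
```

Madgwick and Mahony filters are quaternion-based refinements of this same gyro-plus-reference-vector idea, so they generalize naturally to full 3D orientation.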

Upvotes: 4
