Reputation: 2796
We have a project where we just switched our Sensor type from TYPE_ROTATION_VECTOR to TYPE_GAME_ROTATION_VECTOR.
According to Google's docs: "Identical to TYPE_ROTATION_VECTOR except that it doesn't use the geomagnetic field. Therefore the Y axis doesn't point north, but instead to some other reference. That reference is allowed to drift by the same order of magnitude as the gyroscope drift around the Z axis."
My question is (being an Android noob): how can one 'calibrate' that reference to the current device orientation, i.e. to the orientation the device is in before onSensorChanged first comes into play?
What we need is orientation data relative to a reference frame, that reference frame being the initial orientation of the device in space (so deltas are needed, not absolute rotations).
Any help is highly appreciated. I'm mainly an iOS developer, and this is all of one line of code there :S
Upvotes: 2
Views: 939
Reputation: 28948
"How can one 'calibrate' that reference to the current device orientation? Meaning from the starting orientation the device is in before onSensorChanged enters into play?"
You could literally do the maths: measure the yaw/heading at the start, and compare it to your current value over time. Unlike TYPE_ROTATION_VECTOR, this number wouldn't be affected by magnetic interference.
The trade-off is that your reference can drift.
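A minimal sketch of that approach, assuming a listener registered for TYPE_GAME_ROTATION_VECTOR (the class name and the two 3x3 helpers below are my own; getRotationMatrixFromVector and getOrientation are standard SensorManager calls): capture the rotation matrix of the first sample as the reference frame, and since rotation matrices are orthonormal (transpose = inverse), the product initialInverse * current is the rotation since start.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

/** Sketch: reports yaw/pitch/roll deltas relative to the device's pose
 *  at the time of the first TYPE_GAME_ROTATION_VECTOR sample. */
public class RelativeOrientationListener implements SensorEventListener {

    private float[] initialInverse;               // transpose of the first rotation matrix
    private final float[] current = new float[9]; // latest device-to-world rotation
    private final float[] relative = new float[9];
    private final float[] angles = new float[3];  // azimuth, pitch, roll in radians

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_GAME_ROTATION_VECTOR) return;

        SensorManager.getRotationMatrixFromVector(current, event.values);

        if (initialInverse == null) {
            // First sample defines the reference frame. A rotation matrix is
            // orthonormal, so its inverse is simply its transpose.
            initialInverse = transpose3x3(current);
            return;
        }

        // relative = initialInverse * current: the rotation since the first sample.
        multiply3x3(relative, initialInverse, current);
        SensorManager.getOrientation(relative, angles);
        // angles[0..2] now hold deltas (radians) from the starting orientation.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    private static float[] transpose3x3(float[] m) {
        return new float[] {
            m[0], m[3], m[6],
            m[1], m[4], m[7],
            m[2], m[5], m[8]
        };
    }

    private static void multiply3x3(float[] out, float[] a, float[] b) {
        // Row-major 3x3 multiply: out[row][col] = sum_k a[row][k] * b[k][col].
        for (int row = 0; row < 3; row++) {
            for (int col = 0; col < 3; col++) {
                out[3 * row + col] =
                          a[3 * row]     * b[col]
                        + a[3 * row + 1] * b[col + 3]
                        + a[3 * row + 2] * b[col + 6];
            }
        }
    }
}
```

You'd register it like any other listener, e.g. sensorManager.registerListener(listener, sensorManager.getDefaultSensor(Sensor.TYPE_GAME_ROTATION_VECTOR), SensorManager.SENSOR_DELAY_GAME). Keep in mind that the first-sample reference itself drifts over long sessions, for the reason above.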
Just answering this question because I was researching TYPE_GAME_ROTATION_VECTOR myself.
PS: I would guess the reason you'd switch to TYPE_GAME_ROTATION_VECTOR is to avoid magnetic interference from the surroundings. People usually play games indoors, where hard/soft-iron distortion can interfere with the compass.
Upvotes: 0