damian

Reputation: 2011

Reliable Sensor Fusion on Android?

Currently working on an augmented reality application, I am experimenting with the sensors. I've tried using raw sensor data, combining the accelerometer and magnetic field sensors with rotation matrices, either via SensorManager or with my own implementation, but with no real success. A sketch of the raw-sensor approach I mean is below.
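For reference, this is roughly what I've been doing (a minimal sketch; lastAccelerometer and lastMagnetometer are assumed fields holding the latest readings from onSensorChanged()):

import android.hardware.SensorManager;

private void computeOrientationFromRawSensors() {
    float[] rotationMatrix = new float[9];
    float[] orientationAngles = new float[3];
    // Combine one accelerometer and one magnetometer sample into a
    // rotation matrix, then extract Euler angles from it.
    if (SensorManager.getRotationMatrix(rotationMatrix, null,
            lastAccelerometer, lastMagnetometer)) {
        SensorManager.getOrientation(rotationMatrix, orientationAngles);
        // orientationAngles = {azimuth, pitch, roll} in radians - jittery
    }
}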

Reading more about sensor fusion, I've seen some applications in the Play Store that managed to get compass functionality with both pitch and roll working, for example this one: 3D Gyro Compass (https://play.google.com/store/apps/details?id=fi.finwe.gyrocompass&hl=de). Its sensor fusion code is under the LGPL, so I decided to give it a shot, but my readings are still way off and the orientation is somewhat skewed, so it seems they are doing a bit more.

Are there any solid solutions for getting device orientation on Android, e.g. something I could use in a compass / AR application? Google hasn't shown me anything useful so far.

Upvotes: 1

Views: 6132

Answers (2)

Sean Barbeau

Reputation: 11756

In the most recent version of my GPSTest app on Github (master branch), I'm using the TYPE_ROTATION_VECTOR sensor to control both the orientation and tilt of the map camera (Android Maps API v2). This has the "magic camera" effect (as discussed in this Google I/O talk): when you hold up the device in front of you, you see a first-person view of the map based on your orientation, and when you set the device down, it transitions to an overhead view (much like an AR app). It has reasonable performance, in my opinion.
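The gist of feeding the fused orientation into the map camera looks roughly like this (a simplified sketch, not the exact GPSTest code; map is an initialized GoogleMap, and azimuthDeg/pitchDeg come from the rotation vector sensor, converted to degrees):

import com.google.android.gms.maps.CameraUpdateFactory;
import com.google.android.gms.maps.GoogleMap;
import com.google.android.gms.maps.model.CameraPosition;

private void updateMapCamera(GoogleMap map, float azimuthDeg, float pitchDeg) {
    // Map the device pitch onto the camera tilt: flat on the table ->
    // overhead view (tilt 0), held upright -> first-person view (tilt 90).
    // The Maps API clamps tilt to [0, 90].
    float tilt = Math.min(90f, Math.abs(pitchDeg));

    CameraPosition position = new CameraPosition.Builder(map.getCameraPosition())
            .bearing(azimuthDeg) // rotate the map to match the device heading
            .tilt(tilt)
            .build();
    map.moveCamera(CameraUpdateFactory.newCameraPosition(position));
}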


Note that you will need to remap the coordinate system to handle device orientation changes (i.e., portrait or landscape), which might be a source of some of your issues.
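A sketch of one common remapping (the axis choice depends on your use case; rotation is the 9-element matrix from getRotationMatrixFromVector(), and displayRotation comes from getWindowManager().getDefaultDisplay().getRotation()):

import android.hardware.SensorManager;
import android.view.Surface;

private static float[] remapForDisplay(float[] rotation, int displayRotation) {
    float[] remapped = new float[9];
    switch (displayRotation) {
        case Surface.ROTATION_90:   // landscape, rotated counter-clockwise
            SensorManager.remapCoordinateSystem(rotation,
                    SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, remapped);
            break;
        case Surface.ROTATION_180:  // upside-down portrait
            SensorManager.remapCoordinateSystem(rotation,
                    SensorManager.AXIS_MINUS_X, SensorManager.AXIS_MINUS_Y, remapped);
            break;
        case Surface.ROTATION_270:  // landscape, rotated clockwise
            SensorManager.remapCoordinateSystem(rotation,
                    SensorManager.AXIS_MINUS_Y, SensorManager.AXIS_X, remapped);
            break;
        default:                    // Surface.ROTATION_0, natural orientation
            System.arraycopy(rotation, 0, remapped, 0, 9);
            break;
    }
    return remapped;
}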

See this StackOverflow answer for code snippets showing how to use TYPE_ROTATION_VECTOR on devices that support it (Android 2.3 and up), fall back to TYPE_ORIENTATION on Android 2.2 and lower, remap the coordinate system based on device orientation, and handle Samsung-specific bugs related to the SensorManager.getRotationMatrixFromVector() method. The full implementation can be seen in GPSTest on Github (master branch).

Upvotes: 2

jranalli

Reputation: 750

To my knowledge, the best thing to use for sensor fusion on Android is TYPE_ROTATION_VECTOR. I use this for the orientation in my augmented reality app. It's a composite sensor whose output is generated automatically from the best available combination of the accelerometer, magnetometer, and gyroscope. If you want nice roll/pitch/azimuth-type coordinates instead of the raw rotation vector output, assuming you've registered just the rotation vector sensor, you can do something like this:

// Reused across events to avoid allocating on every callback
private final float[] lastRotVal = new float[3];
private final float[] rotation = new float[9];
private final float[] orientation = new float[3];

public void onSensorChanged(SensorEvent event) {
    // Copy only the first three components of the rotation vector. This
    // also sidesteps a bug on some Samsung devices running Android 4.3,
    // where event.values has more than four elements and would make
    // getRotationMatrixFromVector() throw.
    System.arraycopy(event.values, 0, lastRotVal, 0, 3);

    SensorManager.getRotationMatrixFromVector(rotation, lastRotVal);
    SensorManager.getOrientation(rotation, orientation);

    double azimuth = orientation[0]; // rotation around the z axis, radians
    double pitch = orientation[1];   // rotation around the x axis, radians
    double roll = orientation[2];    // rotation around the y axis, radians

    //Use them for something!
}

If you want to support older devices that don't have TYPE_ROTATION_VECTOR, you can fall back to the TYPE_ORIENTATION sensor, which is deprecated in newer API versions. In that case, test for the API version at runtime and choose which sensor to register on that basis, for example:
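A sketch of that selection (assuming you already have a SensorManager and a SensorEventListener; GINGERBREAD is API 9, i.e. Android 2.3):

import android.hardware.Sensor;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Build;

private void registerBestOrientationSensor(SensorManager sensorManager,
                                           SensorEventListener listener) {
    Sensor sensor;
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.GINGERBREAD) {
        // Android 2.3 and up: use the fused rotation vector
        sensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
    } else {
        // Android 2.2 and lower: fall back to the deprecated sensor
        sensor = sensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION);
    }
    sensorManager.registerListener(listener, sensor,
            SensorManager.SENSOR_DELAY_UI);
}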

If you want to code your own custom fused sensor, it's certainly possible, but I would say it is unlikely to beat the robustness of the built-in composite sensors unless you're doing your own research into best practices.
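For illustration only, the usual starting point is a complementary filter: integrate the gyroscope for short-term responsiveness and blend in the accelerometer/magnetometer orientation to cancel drift. A toy sketch (assumed inputs; a real version must also handle angle wrap-around at +/-pi and convert body rates to Euler-angle rates):

// Toy complementary filter: fused[] holds azimuth/pitch/roll in radians.
private static final float GYRO_WEIGHT = 0.98f;
private final float[] fused = new float[3];

private void fuse(float[] gyroRates, float dt, float[] accMagOrientation) {
    for (int i = 0; i < 3; i++) {
        // Integrate angular rate for responsiveness, then pull toward
        // the absolute accel/mag estimate to cancel gyro drift.
        float gyroEstimate = fused[i] + gyroRates[i] * dt;
        fused[i] = GYRO_WEIGHT * gyroEstimate
                + (1 - GYRO_WEIGHT) * accMagOrientation[i];
    }
}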

Upvotes: 1
