AlbeyAmakiir

Reputation: 2247

How to get Android phone orientation matching human orientation?

I'm making a map app, including the location arrow that shows you which way you're facing, like so:

Map with a blue arrow

I get the orientation directly from SensorManager.getOrientation(), using the first returned value: azimuth. When the phone is held in portrait with the screen pointing above the horizon, the arrow works fine. However, in other orientations the arrow points the wrong way:

The carefully constructed and scientific image below shows what I mean (where blue is the user's facing, red is the arrow direction, the screen is approximately facing the user's face, and Google Maps does exactly what I want):

Graph of what happens vs what I want

(Note that, with Google Maps, it doesn't successfully do the last two actions on the list if auto-rotate is off. But I'm not even at that stage yet. One thing at a time.)

It appears to simply use the direction the Y axis points, as shown here: http://developer.android.com/reference/android/hardware/SensorEvent.html, when I want it to use the reverse of the Z axis direction most of the time, and the Y axis when the phone is flat. However, given the values that getOrientation() returns, I'd have to write complex special cases to fix some of the issues, and the screen-facing-the-horizon case would still be unsolvable. I'm certain there's an easier way.

Here's my code (where lastAcceleration and lastMagneticField both come from the internal sensors):

float[] rotationMatrix = new float[9];
if (SensorManager.getRotationMatrix(rotationMatrix, null, lastAcceleration, lastMagneticField)) {
    float[] orientMatrix = new float[3];
    SensorManager.getOrientation(rotationMatrix, orientMatrix);

    // azimuth is the first value, in radians; convert to degrees
    orientation = orientMatrix[0] * 180 / (float) Math.PI;
}

What am I doing wrong? Is there an easier way to do this?

Edit: Just to clarify, I'm making the assumption that the user is holding the device in front of them, and the screen is pointing towards them. Beyond that, I obviously can't tell if only one of them rotates. Also, I am using the motion of the user when they are moving, but this is for when they are stationary.

Upvotes: 31

Views: 11277

Answers (5)

Momo

Reputation: 11

I suggest applying the following remap; in my case, it works for the first three scenarios:

SensorManager.remapCoordinateSystem(rMat, SensorManager.AXIS_Z, SensorManager.AXIS_Y, rMatRemapped);
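A minimal sketch of where that call might sit in the usual pipeline (lastAcceleration and lastMagneticField are assumed to hold the latest sensor readings, as in the question):

float[] rMat = new float[9];
float[] rMatRemapped = new float[9];
if (SensorManager.getRotationMatrix(rMat, null, lastAcceleration, lastMagneticField)) {
    // apply the suggested remap, then read the azimuth from the remapped matrix
    SensorManager.remapCoordinateSystem(rMat, SensorManager.AXIS_Z, SensorManager.AXIS_Y, rMatRemapped);
    float[] orientation = new float[3];
    SensorManager.getOrientation(rMatRemapped, orientation);
    float azimuthDegrees = (float) Math.toDegrees(orientation[0]);
}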

Upvotes: 1

patelb

Reputation: 2581

You should take the pitch and determine whether the user is holding the phone close to vertical.

I chose to remap the coordinate system once the pitch passes 45 degrees up or down from flat on a table.

// orientation here is the result of getOrientation() on the original matrix
long pitchDegrees = Math.round(Math.toDegrees(-orientation[1]));
if (pitchDegrees < 45 && pitchDegrees > -45) {
    // do something while the phone is horizontal
} else {
    // R below is the original rotation matrix
    float[] remapOut = new float[9];
    SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, remapOut);
    // get the orientation with remapOut
    float[] remapOrientation = new float[3];
    SensorManager.getOrientation(remapOut, remapOrientation);
}

It has worked out pretty well. Let me know if anyone can suggest an improvement on this. Thanks.

Upvotes: 1

Hoan Nguyen

Reputation: 18151

Did you call remapCoordinateSystem? Otherwise, you only get the right facing value when the phone is held vertically. For the case when the phone is held flat (screen level with the horizon), there is no way to get the user's facing: to get the facing you have to project the device's z axis onto the world xy plane, and that projection is zero when the device is held horizontally.

To be more precise, if you want to get the phone's facing, the phone has to be inclined at least about 25 degrees from horizontal, and you have to call remapCoordinateSystem. The following code will give you what you want for the last two pictures above.
Code

float[] rotationMatrix = new float[9];

if (SensorManager.getRotationMatrix(rotationMatrix, null, lastAcceleration, lastMagneticField)) {
    float[] orientMatrix = new float[3];
    float[] remapMatrix = new float[9];
    // remap so the device's z axis stands in for the y axis
    SensorManager.remapCoordinateSystem(rotationMatrix, SensorManager.AXIS_X, SensorManager.AXIS_Z, remapMatrix);
    SensorManager.getOrientation(remapMatrix, orientMatrix);

    orientation = orientMatrix[0] * 180 / (float) Math.PI;
}

getOrientation gives you the correct values assuming the phone is lying flat. Thus, if the phone is held vertically, you have to remap the coordinate system to get back to the flat reference. Geometrically, you project the phone's -z axis down onto the world xy plane and then calculate the angle between that projection and the world y axis.
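As a rough sketch of that projection (not part of the original answer, and assuming the row-major matrix returned by getRotationMatrix(), which maps device coordinates to world coordinates with x = east, y = north, z = up), the heading can also be read straight off the rotation matrix without remapping:

// The device's -z axis expressed in world coordinates is the negated
// third column of the rotation matrix: (-R[2], -R[5], -R[8]).
float east = -rotationMatrix[2];
float north = -rotationMatrix[5];
// Azimuth measured clockwise from north. This is degenerate when the
// phone is flat (east and north both near zero), matching the answer's
// point that the projection vanishes there.
float azimuthDegrees = (float) Math.toDegrees(Math.atan2(east, north));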

Upvotes: 7

Code Commander

Reputation: 17290

It seems that the appropriate way to get the bearing when the user is holding the phone vertically is to use something like this:

// after calling getRotationMatrix, pass the rotation matrix in as inR:
SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);

If you want to handle both ways (vertical and flat) you will probably need to detect that and then only perform this remap when it is vertical.
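A minimal sketch of that detection, not from this answer itself: one simple test, assuming the latest accelerometer reading is available, is whether gravity has mostly left the device's z axis:

// lastAcceleration is assumed to hold the latest accelerometer values (m/s^2).
// When the phone is near vertical, gravity lies mostly in the device's
// x/y plane, so the z component is small.
boolean isVertical = Math.abs(lastAcceleration[2]) < SensorManager.GRAVITY_EARTH * 0.5f;

float[] outR;
if (isVertical) {
    outR = new float[9];
    SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
} else {
    outR = inR; // roughly flat: keep the original rotation matrix
}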

See the API documentation for remapCoordinateSystem.

Upvotes: 1

PeterGriffin

Reputation: 910

This looks pretty tricky. I'm developing a PNS for Android and am facing a somewhat similar problem for which I'm still looking for an answer: How to get the rotation between accelerometer's axis and motion vector?

The thing is, it looks to me absolutely impossible to find which direction the user is facing (as opposed to the device) if they're not moving. The human has no sensor on their body, so what if the device stays in exactly the same position but the user rotates by 90°? I don't see any way to detect this.

What I can suggest (and it actually fits my problem) is that you could use the motion of the user to determine their heading (I don't know what you actually do in your code). Let me explain: say you have a first position A and the user walks to B. You can then build the AB vector and take its direction as the user's heading when they stop at B. You'd then have to limit your code to the direction they are facing when arriving at the destination. A sketch of this idea follows.
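A minimal sketch of that A-to-B heading, assuming the two positions are available as android.location.Location fixes (pointA and pointB are illustrative names):

// pointA and pointB are assumed to be two successive GPS fixes.
// bearingTo() returns the initial bearing from A to B in degrees
// east of true north, in the range (-180, 180].
float headingDegrees = pointA.bearingTo(pointB);
if (headingDegrees < 0) {
    headingDegrees += 360f; // normalize to [0, 360)
}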

I know this is not as good as what Google Maps achieves, but do you know what Google uses for this? I mean, do they only use the accelerometer and magnetic field sensors?

Upvotes: 0
