Reputation: 129
I am developing an app for both Android and iOS that works as a virtual assistant for car drivers. The app tries to predict possible collisions or accidents by making use of the phone's accelerometer and gyroscope.
What I am looking for is an algorithm that can detect when the car has drifted, that is, when the car has made a turn at very high speed or has started going in circles: basically, anything that may cause an accident.
The closest answer I've found is this one: Algorithm to detect left or right turn from x,y co-ordinates
However, that would only tell me whether the car has made a turn; I would also need to know the speed at which the turn was made in order to know whether it was a drift or not.
My app calculates both the XYZ acceleration and the XYZ position of the phone every 500 ms, so how could I detect, using that information, that the vehicle has drifted?
Sorry for not giving much more information; I am trying to figure out how to do this, but I have not found anything that could answer the question.
Upvotes: 0
Views: 600
Reputation: 291
Some simple ideas:
If you only need the speed, then just compute it as velocity = distance / time.
To know whether you are taking a tight curve, you can compare the velocity and the acceleration vectors: if they differ by a significant angle and the acceleration is large, then you are taking a tight curve (you have a lot of centrifugal force). You can compute this no matter the orientation of the phone.
For drifting (sliding in one direction) you would have to know where the front of the car is, as Daniel suggests, because in some situations, such as sliding on ice, you could be moving at almost constant speed with no acceleration, and the phone would not be able to tell that you are sliding. So you either fix the position of the phone, or have some algorithm for the phone to "learn" where the front of the car is based on previous measurements.
So, if (x,y,z) is the position and (ax,ay,az) the acceleration, you can compute the velocity, the norm of the acceleration, and the angle between the two vectors as:
dt=0.5                                  // 500 ms between samples, expressed in seconds
vx=(x[0]-x[1])/dt                       // velocity from the last two positions
vy=(y[0]-y[1])/dt                       // (index 0 = newest sample, index 1 = previous one)
vz=(z[0]-z[1])/dt
normv=Math.sqrt(vx*vx+vy*vy+vz*vz)      // speed
norma=Math.sqrt(ax[0]*ax[0]+ay[0]*ay[0]+az[0]*az[0])            // acceleration magnitude
angle=Math.acos( (vx*ax[0]+vy*ay[0]+vz*az[0])/(norma*normv) )   // angle between v and a; guard against norma*normv being 0
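To turn these numbers into the tight-curve test described above, you could compare them against thresholds. The limit values below are invented for illustration and would need tuning on real recordings:

ANGLE_LIMIT = 60 * Math.PI / 180        // velocity and acceleration noticeably misaligned (made-up value)
ACCEL_LIMIT = 4.0                       // m/s^2 of acceleration, "a lot of centrifugal force" (made-up value)
tightCurve = (angle > ANGLE_LIMIT) && (norma > ACCEL_LIMIT)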
It is usually better not to use the position and the acceleration directly, but their running average over the last few measurements, in order to smooth the readings.
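A minimal sketch of such a running average, in the same informal style as the snippet above; the window of 5 samples is an arbitrary choice:

function runningAverage(samples) {      // samples[0] is the newest reading
    n = Math.min(samples.length, 5)     // average the 5 most recent samples
    sum = 0
    for (i = 0; i < n; i++) sum += samples[i]
    return sum / n
}

You would keep short histories of each component and use the smoothed values in place of the raw x[0], ax[0], etc. in the computation above.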
Upvotes: 0
Reputation: 7724
Well, maybe I'm wrong, but it looks a bit simple on paper. A drift looks like this:
Basically, it happens when the real velocity vector (represented as R) and the forward (or look) vector (represented as L) are separated by a considerable angle θ.
So, in order to detect a drift, I would set a "maximum accepted θ". If the angle goes over it, it means the car is drifting.
The real velocity R can be measured by subtracting the previous position from the current position and dividing the result by the time elapsed between them.
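A rough sketch in the same style as the code in the other answer: R comes from the last two positions, while the forward vector L has to come from somewhere else, for example the phone's orientation or a heavily smoothed velocity history; here it is simply assumed to be known as (lx, ly), and the θ threshold is only an illustrative value:

dt = 0.5                                    // 500 ms between samples, in seconds
rx = (x[0]-x[1])/dt                         // real velocity R from the last two positions
ry = (y[0]-y[1])/dt
normR = Math.sqrt(rx*rx + ry*ry)
normL = Math.sqrt(lx*lx + ly*ly)            // (lx, ly): forward vector L, assumed known
theta = Math.acos((rx*lx + ry*ly)/(normR*normL))     // angle between R and L
MAX_THETA = 20 * Math.PI / 180              // "maximum accepted θ"; the value is a guess
drifting = (normR > 1.0) && (theta > MAX_THETA)      // ignore near-standstill samples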
Upvotes: 0