Reputation: 41
Can upper and lower limit constraints be applied to the estimates updated by a Kalman filter?
One of my states can only take non-negative values in practice. When I apply the Kalman filter, however, this state gets updated to negative values. How can I apply this limit constraint in the Kalman filter?
Please reply
Thanks
Upvotes: 3
Views: 2566
Reputation: 347
The other answers are more general and better suited to most problems. One realization I had when approaching this problem was that, even though one of my states is constrained, the dynamics and measurement updates were such that the state would actually never violate the constraints. It's a bit of a special case: my state is static and has no correlation with the other states, so it can be treated as an isolated one-state system.
In my case this ensures that the resulting state remains within the constraints, because the Kalman update is then just an interpolation between the prior estimate and the measurement, and both of those respect the bounds. If you want to constrain the covariance as well, that's a different story.
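A one-line illustration of that interpolation view for a scalar state that is measured directly (a minimal Python sketch, assuming H = 1 so the gain lies in [0, 1)):

    def scalar_update(x_prior, P, z, R):
        # Scalar Kalman update with a direct measurement (H = 1).
        # K is in [0, 1), so x_post is a convex combination of x_prior and z:
        # if both of them respect the bounds, so does x_post.
        K = P / (P + R)
        x_post = (1.0 - K) * x_prior + K * z
        P_post = (1.0 - K) * P
        return x_post, P_post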
Upvotes: 0
Reputation: 3493
This can actually be implemented very trivially in every linear or nonlinear Kalman filter formulation: just apply min(xmax, max(xmin, x_plus)) in every time step, where x_plus is your state estimate after the update step. While this may sound like a bad hack that loses all of a Kalman filter's nice properties, it is actually nicely justified by theory. For details, refer to D. Simon (2010), "Kalman filtering with state constraints: a survey of linear and nonlinear algorithms", IET Control Theory & Applications. Simon discusses the case of general linear inequality constraints, which is more complex, but in the case of a simple bounding box the "state projection method" reduces to the above operation.
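A minimal sketch of that clamping step in Python (NumPy assumed; the bound values and array names are placeholders):

    import numpy as np

    # placeholder box constraints, e.g. first state must stay non-negative
    x_min = np.array([0.0, -np.inf])
    x_max = np.array([np.inf, np.inf])

    def project_to_box(x_plus):
        # element-wise min(xmax, max(xmin, x_plus))
        return np.minimum(x_max, np.maximum(x_min, x_plus))

    # after the measurement update of your filter:
    # x_plus = project_to_box(x_plus)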
Upvotes: 6
Reputation: 121
Some common ways to bound state variables in an EKF (sorry, no plain KF) are the following. Let x be the unconstrained state the filter works with, and x' the bounded quantity you actually care about.
To force a bound such as x' > a, define x' = exp(x) + a. This will never be less than a, since exp(x) -> 0 as x -> -inf. The same idea works for an upper bound x' < -a, using x' = -exp(x) - a.
If you want both a lower and an upper bound on a state, such as a < x' < b, this can be achieved with a sigmoid function. The most popular (in my experience) is tanh, where x' = tanh(x) bounds x' to (-1, 1). It is then a simple step to generalize to a < x' < b by scaling and offsetting appropriately: x' = tanh(x)*(b-a)/2 + (a+b)/2. When tanh(x) = -1 we get a, and when tanh(x) = 1 we get b, successfully implementing the desired bound.
This usually covers the majority of bounds, and the respective derivatives (needed for the EKF Jacobians) are well behaved. Hope it helps!
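A minimal Python sketch of the tanh mapping and its derivative (the derivative is what you would chain into the EKF Jacobians; the bounds a and b are placeholders):

    import numpy as np

    a, b = 0.0, 10.0   # placeholder lower/upper bounds on the physical quantity

    def to_bounded(x):
        # map the unconstrained filter state x to x' in (a, b)
        return np.tanh(x) * (b - a) / 2.0 + (a + b) / 2.0

    def d_to_bounded(x):
        # derivative dx'/dx = (1 - tanh(x)^2) * (b - a)/2
        return (1.0 - np.tanh(x) ** 2) * (b - a) / 2.0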
Upvotes: 1
Reputation: 93720
It's easy to do this in a UKF (unscented Kalman filter) by simply constraining the sigma points (those are the points you generate that approximate the Gaussian distribution of your state with errors taken into account).
For an EKF you can find papers on how to project the state estimate back onto the constraint boundary; that approach is more involved.
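A rough sketch of the sigma-point idea in Python, assuming the sigma points have already been generated and the box bounds are placeholders:

    import numpy as np

    x_min, x_max = 0.0, np.inf   # placeholder bounds: non-negative state

    def constrain_sigma_points(sigma_points):
        # clip every sigma point into the feasible region before it is
        # propagated and used to form the predicted mean and covariance
        return np.clip(sigma_points, x_min, x_max)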
Upvotes: 2
Reputation: 655
One mechanism is to create an artificial measurement. In this case, if the estimate falls below zero, process a 'measurement' that pulls it back in. So if the estimate is -x, process an artificial measurement of x with a measurement noise variance of x^2. One obvious drawback is that this distorts the covariance estimate for the state variable.
A second method is to transform the state vector and covariance matrix to a space where the constraint does not exist. For example, the filter could operate in logarithmic space. The filter can then produce positive or negative values, but when converted back to the normal space by the transform x' = e^x, all values are positive. This precludes generating a negative estimate, but of course the covariance matrix is now a description of the second-order statistics in logarithmic space, which may not be an accurate representation of the true statistics. To move the filter to the other space, all of the model matrices must be transformed: the transition matrix, the measurement Jacobian, and the process and measurement noise matrices.
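A minimal sketch of the artificial-measurement idea for a single non-negative state, using a standard scalar update (the function name and the choice of pseudo-measurement follow the description above and are not from any library):

    def enforce_nonnegative(x_est, P):
        # If the scalar estimate has gone negative, fuse an artificial
        # measurement of +|x_est| with variance x_est^2 to pull it back in.
        if x_est < 0.0:
            z = -x_est           # artificial measurement value (the 'x' above)
            r = z ** 2           # artificial measurement noise variance (x^2)
            k = P / (P + r)      # scalar Kalman gain for a direct measurement
            x_est = x_est + k * (z - x_est)
            P = (1.0 - k) * P    # this is the covariance distortion mentioned above
        return x_est, P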
Upvotes: 1