pete

Reputation: 93

Least squares problem

After collecting data for my model:

y = b(0) + b(1)x(1) + ... + b(m-1)x(m-1)

In matrix form, y = Xb, where y is a column vector (n*1), X is a matrix (n*m), and b is (m*1).

I implemented a solution in Python using ordinary least squares (OLS) to find b. My problem is that my solution depends entirely on whether X is invertible; if it is not, I cannot estimate b using OLS.
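
For reference, a simplified sketch of the kind of normal-equations approach I mean (not my exact code, but it fails the same way when X^T * X is singular):

import numpy as np

def ols_fit(X, y):
    # Normal equations: b = inv(X^T * X) * X^T * y
    # np.linalg.solve raises LinAlgError when X^T * X is singular,
    # which is the failure case described above.
    XtX = X.T @ X
    Xty = X.T @ y
    return np.linalg.solve(XtX, Xty)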

Any suggestions?

Thanks.

Upvotes: 2

Views: 710

Answers (2)

Casey

Reputation: 487

In case someone finds this later: the matrix X is n x m and therefore will never be invertible for n not equal to m. The normal equations for OLS involve inverting the matrix X^T * X, and as long as X is tall (n > m) and has linearly independent columns, this will be invertible (usually the case if you have more measurements than variables to estimate).

In the event that X^T * X is not invertible, you will have to make some sort of simplifying assumption. Typically this assumption is (roughly) that the weights b are small unless there is enough data to show otherwise. This is captured by transforming the least squares problem into

minimize ||Xb - y||^2 + lambda * ||b||^2

where lambda is a positive scalar. Essentially this penalizes large values of the parameters b. You can of course make this penalty arbitrarily large or small by scaling lambda. Rather than the OLS solution

b_ols = inv(X^T * X) * X^T * y

you can work out the math to find that the solution to the regularized problem is

b_reg = inv(X^T * X + lambda * I) * X^T * y

The matrix X^T * X + lambda * I will always be invertible for any positive value of lambda. To find a "good" value of lambda you would typically need to do something like cross validation.
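
A minimal NumPy sketch of the regularized solve, with a placeholder lam=1.0 that you would tune by cross validation:

import numpy as np

def ridge_fit(X, y, lam=1.0):
    # b_reg = inv(X^T * X + lambda * I) * X^T * y
    # Adding lambda * I makes the matrix positive definite, so the
    # solve succeeds even when X^T * X itself is singular.
    m = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(m), X.T @ y)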

Upvotes: 0

duffymo

Reputation: 308733

Yes, use SVD (Singular Value Decomposition) to solve that system of equations.
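
For example, a minimal NumPy sketch (np.linalg.lstsq is SVD-based and handles rank-deficient X):

import numpy as np

def svd_fit(X, y):
    # lstsq uses an SVD-based LAPACK routine and returns the minimum-norm
    # least-squares solution even when X is rank deficient.
    b, residuals, rank, singular_values = np.linalg.lstsq(X, y, rcond=None)
    return b

# Equivalently, via the Moore-Penrose pseudoinverse (also computed from the SVD):
# b = np.linalg.pinv(X) @ y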

Upvotes: 2
