Dail

Reputation: 4602

How to do linear regression with matrix objects?

I have a problem doing linear regression with three matrix objects.

m1 = matrix(c(1:10))
m2 = matrix(c(10:19))
m3 = matrix(c(100:109))

I DO NOT have a problem if I do:

mod = lm(m1+m2 ~ m3+0)

I have the problem if I only use TWO matrices, like:

m1 = matrix(c(1:20), ncol=2)
m2 = matrix(c(1:10))

mod = lm(m1 ~ m2+0)

In this case I get TWO coefficients for m2:

Coefficients:
    [,1]   [,2] 
m2  1.000  2.429

But I do not want that; I would like the two columns of the m1 matrix to work as in the previous example (as two distinct columns).

How to do it?

Upvotes: 1

Views: 8986

Answers (1)

Seth

Reputation: 4795

In your first example you are summing your two column vectors element-wise and using that sum as the response, so lm() sees a single response column. When the response is a two-column matrix like your m1, lm() fits a separate regression for each column, which is why you get two coefficients for m2.
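A quick check of that behaviour, as a sketch reusing the objects from your question:

m1 = matrix(c(1:10))
m2 = matrix(c(10:19))
m3 = matrix(c(100:109))
dim(m1 + m2)                # 10 x 1: the element-wise sum is a single response column
coef(lm(m1 + m2 ~ m3 + 0))  # one coefficient for m3

m1 = matrix(c(1:20), ncol=2)
m2 = matrix(c(1:10))
coef(lm(m1 ~ m2 + 0))       # a 1 x 2 matrix: one coefficient per response column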

For the matrix m1 I think you want the row sums as the response, like:

m1 = matrix(c(1:2000), ncol=200)
m2 = matrix(c(1:10))
msum = apply(m1, 1, sum)   # one sum per row of m1

Now use msum as your response:

mod = lm(msum ~ m2 + 0)

This gives just one coefficient. I think this is what you want, but I am still not sure why you would want it.
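To connect this back to your original example (just a sketch, using your m1 and m2 from the question, with rowSums() as base R's shortcut for apply(m1, 1, sum)): least squares is linear in the response, so the single coefficient for the row sums is the sum of the two per-column coefficients you saw.

m1 = matrix(c(1:20), ncol=2)
m2 = matrix(c(1:10))
coef(lm(rowSums(m1) ~ m2 + 0))  # about 3.429, i.e. 1.000 + 2.429
coef(lm(m1 ~ m2 + 0))           # the two per-column coefficients: 1.000 and 2.429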

Upvotes: 2
