Reputation: 1
I have read about poly() in R, and my understanding is that it produces orthogonal polynomials, so that when we use it in a regression model like lm(y~poly(x,2)) the predictors are uncorrelated. However:
poly(1:3,2)
                 1          2
[1,] -7.071068e-01  0.4082483
[2,] -7.850462e-17 -0.8164966
[3,]  7.071068e-01  0.4082483
This is probably a stupid question, but what I don't understand is that the column vectors of poly(1:3,2) do not seem to have an inner product of zero. That is, -7.07*0.40 - 7.85*(-0.82) + 7.07*0.41 ≠ 0, so how are these uncorrelated predictors for regression?
Upvotes: 0
Views: 198
Reputation: 226097
Your main problem is that you're missing the meaning of the e, or "E notation": as commented by @MamounBenghezal above, fffeggg is shorthand for fff * 10^(ggg).
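For example, taking the tiny value from your own output (just to illustrate the notation; any small number would do):
-7.850462e-17                 # means -7.850462 * 10^(-17)
## [1] -7.850462e-17
abs(-7.850462e-17) < 1e-10    # nowhere near -7.85; it is essentially zero
## [1] TRUE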
I get slightly different answers than you do (the difference is numerically trivial) because I'm running this on a different platform:
pp <- poly(1:3,2)
##                  1          2
## [1,] -7.071068e-01  0.4082483
## [2,]  4.350720e-18 -0.8164966
## [3,]  7.071068e-01  0.4082483
In a format that's easier to read:
print(zapsmall(matrix(c(pp),3,2)),digits=3)
##        [,1]   [,2]
## [1,] -0.707  0.408
## [2,]  0.000 -0.816
## [3,]  0.707  0.408
sum(pp[,1]*pp[,2]) ## 5.196039e-17, effectively zero
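You can also check all of the inner products at once; a quick sketch using the pp computed above (crossprod(pp) is just t(pp) %*% pp):
zapsmall(crossprod(pp))   # both columns have length 1 and are orthogonal to each other
##   1 2
## 1 1 0
## 2 0 1
The off-diagonal entries are about 5e-17 before zapsmall() rounds them to zero.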
Or to use your example, with the correct placement of decimal points:
-0.707*0.408-(7.85e-17)*(-0.82)+(0.707)*0.408
## [1] 5.551115e-17
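To connect this back to your regression question: the columns of poly() are also uncorrelated in the statistical sense. A quick check, using a slightly longer x than 1:3 (the choice of 1:10 is arbitrary):
x <- 1:10
zapsmall(cor(poly(x, 2)))
##   1 2
## 1 1 0
## 2 0 1
The off-diagonal correlations are zero up to floating-point error, which is why the predictors in lm(y ~ poly(x, 2)) are uncorrelated.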
Upvotes: 3