docdev

Reputation: 894

eigen() and the correct eigenvectors

My problem is the following:

I'm trying to use R to solve this problem numerically. I've set up the problem correctly in my console, and then I tried to compute the eigenvectors.

[R program: screenshot of the console session and its eigen() output]
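(The screenshot is not reproduced here; the session was presumably along these lines, using the transition matrix that Ben Bolker reproduces in his answer below, so the exact code may differ.)

C <- matrix(c(0.5, 0.25, 0, 0.5, 0.5, 0.5, 0, 0.25, 0.5),
            nrow = 3)       # transition matrix, filled column by column
eigen(t(C))$vectors         # first column: 0.4082483 0.8164966 0.4082483, the values quoted below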

But I expected the eigenvector associated with lambda = 1 to be (1, 2, 1) instead of what I got here. The proportions are correct (0.4082483 is indeed half of 0.8164966), but I would like to obtain the result on a consistent scale.

My original problem is to find a stationary distribution for a Markov chain using R instead of doing it on paper. From a probabilistic point of view, the stationary distribution is a vector whose components sum to 1. For that reason I was trying to rescale the eigenvector to obtain what I called "a consistent result" above.

How can I do that?

Upvotes: 1

Views: 936

Answers (2)

Stéphane Laurent

Reputation: 84719

The eigenvectors returned by R are normalized to unit Euclidean norm. If V is an eigenvector, then s * V is an eigenvector as well for any non-zero scalar s. If you want the stationary distribution as in your link, divide by the sum:

V / sum(V)

and you will get (1/4, 1/2, 1/4).
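For instance, a minimal sketch with the transition matrix from the question (as reproduced in Ben Bolker's answer); here the lambda = 1 eigenvector happens to be the first column:

C <- matrix(c(0.5, 0.25, 0, 0.5, 0.5, 0.5, 0, 0.25, 0.5), nrow = 3)
V <- eigen(t(C))$vectors[, 1]   # eigenvector associated with lambda = 1
V / sum(V)                      # 0.25 0.50 0.25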

So, to get all the solutions in one shot:

ev <- eigen(t(C))$vectors
t(t(ev) / colSums(ev))   # divide each column (eigenvector) by its sum

(A plain ev / colSums(ev) would recycle the sums along the rows rather than the columns, which is not what we want.)

Upvotes: 3

Ben Bolker

Reputation: 226911

C <- matrix(c(0.5, 0.25, 0, 0.5, 0.5, 0.5, 0, 0.25, 0.5),
            nrow = 3)              # transition matrix of the chain (rows sum to 1)
ee <- eigen(t(C))$vectors          # eigenvectors of t(C), i.e. left eigenvectors of C

As suggested by @Stéphane Laurent in the comments, the scaling of eigenvectors is arbitrary; only their relative values are determined. The default in R is that each eigenvector has unit Euclidean norm, i.e. its sum of squares is 1, so colSums(ee^2) is a vector of 1s.
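You can check this directly:

colSums(ee^2)   # c(1, 1, 1): every column is scaled to unit Euclidean norm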

Following the link, we can see that you want each eigenvector to sum to 1.

ee2 <- sweep(ee, MARGIN = 2, STATS = colSums(ee), FUN = `/`)

(i.e., divide each eigenvector by its sum).

(This is a good general solution, but in this case the sums of the second and third eigenvectors are both approximately zero [theoretically, they are exactly zero], so the rescaling only really makes sense for the first eigenvector.)
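If you only need the stationary distribution itself, one option (a sketch, not part of the original answer) is to pick out the eigenvector whose eigenvalue is numerically 1 and normalize just that one; Re() guards against the tiny imaginary parts that eigen() can return for non-symmetric matrices:

e <- eigen(t(C))
k <- which.min(abs(e$values - 1))     # index of the eigenvalue closest to 1
statdist <- Re(e$vectors[, k])        # drop any negligible imaginary parts
statdist <- statdist / sum(statdist)  # rescale so the probabilities sum to 1
statdist                              # 0.25 0.50 0.25
statdist %*% C                        # reproduces statdist (up to rounding): it is stationary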

Upvotes: 2
