hhh

Reputation: 52840

A^k for matrix multiplication in R?

Suppose A is some square matrix. How can I easily exponentiate this matrix in R?

I have tried two approaches already: Trial 1 is a for-loop hack and Trial 2 is a bit more elegant, but both are still a far cry from the simplicity of A^k.

Trial 1

set.seed(10)
t(matrix(rnorm(16),ncol=4,nrow=4)) -> a 
for(i in 1:2){a <- a %*% a}

Trial 2

a <- t(matrix(c(0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0),nrow=4))
i <- diag(4) 
(function(n) {if (n<=1) a else (i+a) %*% Recall(n-1)})(10)

Upvotes: 13

Views: 15959

Answers (6)

mel

Reputation: 105

Simple solution

`%^%` <- function(A, n) {  # assumes n >= 1; returns A unchanged for n <= 1
  A1 <- A
  for(i in seq_len(n-1)){
    A <- A %*% A1
  }
  return(A)
}
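A quick usage check (a self-contained copy of the operator; the 2x2 test matrix is just an example), confirming the loop matches chaining %*% by hand:

```r
`%^%` <- function(A, n) {  # simple repeated multiplication; assumes n >= 1
  A1 <- A
  for (i in seq_len(n - 1)) {
    A <- A %*% A1
  }
  A
}

m <- matrix(c(2, 0, 1, 3), 2, 2)
stopifnot(all.equal(m %^% 3, m %*% m %*% m))
```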

Upvotes: 0

Benjamin

Reputation: 471

Indeed, the expm package does use exponentiation by squaring.

In pure R, this can be done rather efficiently like so:

"%^%" <- function(mat,power){
    base = mat
    out = diag(nrow(mat))
    while(power > 1){
        if(power %% 2 == 1){
            out = out %*% base
        }
        base = base %*% base
        power  = power %/% 2
    }
    out %*% base
}
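As a quick sanity check (a self-contained copy of the operator; the 3x3 test matrix is just an example), the squaring version agrees with a naive repeated product:

```r
"%^%" <- function(mat, power){  # exponentiation by squaring
    base = mat
    out = diag(nrow(mat))
    while(power > 1){
        if(power %% 2 == 1){
            out = out %*% base
        }
        base = base %*% base
        power = power %/% 2
    }
    if(power == 1) out = out %*% base
    out
}

m <- matrix(1:9, 3)
naive <- m %*% m %*% m %*% m %*% m   # m to the 5th by hand
stopifnot(all.equal(m %^% 5, naive))
```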

Timing this,

m0 <- diag(1, nrow=3,ncol=3)
system.time(replicate(10000, m0%^%4000))#expm's %^% function
user  system elapsed 
0.31    0.00    0.31 
system.time(replicate(10000, m0%^%4000))# my %^% function
user  system elapsed 
0.28    0.00    0.28 

So, as expected, they are the same speed because they use the same algorithm. It looks like the overhead of the looping R code does not make a significant difference.

So, if you don't want to depend on expm and you need that performance, you can just use this version, provided you don't mind the imperative code.

Upvotes: 4

flodel

Reputation: 89057

If A is diagonalizable, you could use eigenvalue decomposition:

matrix.power <- function(A, n) {   # only works for diagonalizable matrices
   e <- eigen(A)
   M <- e$vectors   # matrix for changing basis
   d <- e$values    # eigen values
   return(M %*% diag(d^n) %*% solve(M))
}
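A quick check (self-contained; the test matrix is just an example with distinct real eigenvalues, so it is diagonalizable and the result stays real):

```r
matrix.power <- function(A, n) {   # only works for diagonalizable matrices
  e <- eigen(A)
  e$vectors %*% diag(e$values^n) %*% solve(e$vectors)
}

A <- matrix(c(2, 1, 0, 3), 2, 2)   # triangular, distinct eigenvalues 2 and 3
stopifnot(all.equal(matrix.power(A, 3), A %*% A %*% A))
```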

When A is not diagonalizable, the matrix M (the matrix of eigenvectors) is singular, so using this with A = matrix(c(0,1,0,0),2,2) gives: Error in solve.default(M) : system is computationally singular.

Upvotes: 14

IRTFM

Reputation: 263342

Although Reduce is more elegant, a for-loop solution is faster and seems to be as fast as expm's %^%:

m1 <- matrix(1:9, 3)
m2 <- matrix(1:9, 3)
m3 <- matrix(1:9, 3)
system.time(replicate(1000, Reduce("%*%" , list(m1,m1,m1) ) ) )
#   user  system elapsed 
#  0.026   0.000   0.037 
mlist <- list(m1,m2,m3)
m0 <- diag(1, nrow=3,ncol=3)
system.time(replicate(1000, for (i in 1:3 ) {m0 <- m0 %*% m1 } ) )
#   user  system elapsed 
#  0.013   0.000   0.014 

library(expm)  # and I think this may be imported with pkg:Matrix
system.time(replicate(1000, m0%^%3))
# user  system elapsed 
#0.011   0.000   0.017 

On the other hand, the matrix.power solution is much, much slower:

system.time(replicate(1000, matrix.power(m1, 4)) )
   user  system elapsed 
  0.677   0.013   1.037 

@BenBolker is correct (yet again). The for-loop appears linear in time as the exponent rises, whereas the expm %^% function appears to scale even better than log(exponent).

> m0 <- diag(1, nrow=3,ncol=3)
> system.time(replicate(1000, for (i in 1:400 ) {m0 <- m0 %*% m1 } ) )
   user  system elapsed 
  0.678   0.037   0.708 
> system.time(replicate(1000, m0%^%400))
   user  system elapsed 
  0.006   0.000   0.006 

Upvotes: 11

WAF

Reputation: 1151

A shorter solution with eigenvalue decomposition:

"%^%" <- function(S, power)
         with(eigen(S), vectors %*% (values^power * t(vectors)))
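Note that this shortcut replaces solve(vectors) with t(vectors), which is only correct when S is symmetric, so that the eigenvectors come back orthonormal. A self-contained check on a symmetric example:

```r
"%^%" <- function(S, power)   # eigen shortcut; requires symmetric S
         with(eigen(S), vectors %*% (values^power * t(vectors)))

S <- matrix(c(2, 1, 1, 2), 2, 2)   # symmetric, eigenvalues 3 and 1
stopifnot(all.equal(S %^% 3, S %*% S %*% S))
```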

Upvotes: 2

Ben Bolker

Reputation: 226182

The expm package has an %^% operator:

library("sos")
findFn("{matrix power}")
install.packages("expm")
library("expm")
?matpow
set.seed(10)
a <- t(matrix(rnorm(16), ncol = 4, nrow = 4))
a %^% 8

Upvotes: 12
