Katia

Reputation: 1

VAR(1) rolling window (Vector autoregression)

Can somebody help me to run in R a VAR(1) (vector autoregression) with a rolling window on a multiple time series AND somehow store the Bcoef (coefficients) and residuals? It seems like I cannot figure out a way to do it all at once.

My code (using the vars package for vector autoregressions):

library(vars)  # VAR(), Bcoef(), resid()
library(zoo)   # rollapply(), rollapplyr()
varcoef <- function(x) Bcoef(VAR(x, p = 1, type = "const", lag.max = NULL))
varr <- function(x) resid(VAR(x, p = 1, type = "const", lag.max = NULL))
rolling.var.coef <- rollapply(eur.var, width = 120, varcoef, by.column = FALSE)
var.resids <- as.data.frame(rollapplyr(eur.var, width = 120, varr, by.column = FALSE))

There are two problems with this approach.

Here is an approximate view of my data, about 3000 days like this:

              V1    V2    V3    V4    V5    V6    V7
2016-05-10 -0.34 -0.35 -0.37 -0.40 -0.41 -0.30  0.14
2016-05-09 -0.36 -0.35 -0.37 -0.40 -0.41 -0.30  0.15

Upvotes: 0

Views: 2703

Answers (1)

Luks

Reputation: 143

So, try something along these lines (the approach is borrowed from pieces of code in the frequencyConnectedness package).

library(vars)

data(Canada)
data <- data.frame(Canada)
window <- 10

# your VAR function, saving both matrices (coefficients and residuals) in a list
caller <- function(j) {
  var.2c <- VAR(data[(1:window) + j, ], p = 1, type = "const")
  B <- Bcoef(var.2c)
  r <- resid(var.2c)
  list(B, r)
}

# Roll the function over moving windows (j shifts the window start)
out <- pbapply::pblapply(0:(nrow(Canada) - window), caller)
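
Each element of out then holds the coefficient matrix first and the residual matrix second, so pulling them apart afterwards is straightforward. A minimal sketch, assuming the out list created above:

# coefficients and residuals of the first window
B1 <- out[[1]][[1]]
r1 <- out[[1]][[2]]

# stack all coefficient matrices into a 3-D array (third dimension = window index)
B.all <- simplify2array(lapply(out, `[[`, 1))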

The beauty here is that with larger and more time-consuming functions (such as SVARs) you can go parallel.

Parallel computations on Linux/Mac

For example, on Linux/Mac systems this should make your computer's life easier (a different story for Windows; see the solution below):

library(vars)
library(pbapply)
library(parallel)  # detectCores(), makeCluster(), stopCluster()

data(Canada)
data <- data.frame(Canada)
window <- 10

caller <- function(j) {
  var.2c <- VAR(data[(1:window) + j, ], p = 1, type = "const")
  B <- Bcoef(var.2c)
  r <- resid(var.2c)
  list(B, r)
}

# Calculate the number of cores and define cluster
no_cores <- detectCores() - 1
cluster <- makeCluster(no_cores, type ="FORK")

out <- pbapply::pblapply(0:(nrow(Canada)-window), caller, cl = cluster)

stopCluster(cluster)
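
A FORK cluster copies the current R session to the workers, so nothing has to be exported to them; this cluster type is not available on Windows, which is why the PSOCK version below exports the data, the function and the required package explicitly.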

Parallel computations on Windows

library(pbapply)
library(parallel)

# Calculate the number of cores and create a PSOCK cluster (the default on Windows)
no_cores <- detectCores() - 1
cluster <- makeCluster(no_cores)

# Export the necessary data and functions to the cluster workers
# and load the required packages on them
clusterExport(cluster, c("Canada", "data", "window", "caller"), envir = environment())
clusterEvalQ(cluster, library(vars))

# Moving-window estimation
out <- pblapply(0:(nrow(Canada) - window), caller, cl = cluster)

stopCluster(cluster)
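
If you want the residuals back in a single data frame, as in the question, you can stack the per-window results afterwards. A minimal sketch, assuming the out list from either version above (the window column name is just illustrative):

# the second list element of each window holds the residual matrix
resid.list <- lapply(seq_along(out), function(i) data.frame(window = i, out[[i]][[2]]))
all.resids <- do.call(rbind, resid.list)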

Upvotes: 1
