Reputation: 11366
I want to run a program that needs a substantial amount of time. I want to write a function that can run in parallel (I am a graphical-interface user on Windows). The function divides the task into n sub-tasks and performs a final consensus task. I want to run the n sub-tasks in parallel (at the same time, within the same program window) and then combine their outputs. The following is just an example:
ptm <- proc.time()
j1 <- cov(mtcars[1:10,], use="complete.obs") # job 1
j2 <- cov(mtcars[11:20,], use="complete.obs") # job 2
j3 <- cov(mtcars[21:32,], use="complete.obs") # job 3
proc.time() - ptm
out <- list (j1 = j1, j2 = j2, j3 = j3)
I know that in Unix, "&" usually allows jobs to run in the background. Is there a similar way to do this in R?
Upvotes: 5
Views: 7186
Reputation: 32351
You can use mclapply or clusterApply to launch several functions in parallel. They are not really in the background: R will wait until they are all finished (as if you were using wait in a Unix shell after launching the processes in the background).
library(parallel)
tasks <- list(
job1 = function() cov(mtcars[1:10,], use="complete.obs"),
job2 = function() cov(mtcars[11:20,], use="complete.obs"),
job3 = function() cov(mtcars[21:32,], use="complete.obs"),
# To check that the computations are indeed running in parallel.
job4 = function() for (i in 1:5) { cat("4"); Sys.sleep(1) },
job5 = function() for (i in 1:5) { cat("5"); Sys.sleep(1) },
job6 = function() for (i in 1:5) { cat("6"); Sys.sleep(1) }
)
# Using fork(): runs the tasks in parallel on Linux/macOS.
# On Windows, mclapply requires mc.cores = 1, i.e. it runs serially.
out <- mclapply(
tasks,
function(f) f(),
mc.cores = length(tasks)
)
# Alternatively: create a (PSOCK) cluster, use it, and destroy it.
# (This also works on Windows.)
cl <- makeCluster( length(tasks) )
out <- clusterApply(
cl,
tasks,
function(f) f()
)
stopCluster(cl)
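The question also mentions combining the outputs in a final consensus step. As a minimal sketch (interpreting "consensus" as an element-wise average of the covariance matrices, which is my assumption rather than something stated in the question), you could reduce the list returned above:
# Hypothetical consensus step: average the covariance matrices from the
# first three jobs (jobs 4-6 only print, so they are skipped here).
cov_jobs <- out[1:3]
consensus <- Reduce(`+`, cov_jobs) / length(cov_jobs)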
Upvotes: 7
Reputation: 60924
I have good experience using the plyr package functions together with a parallel backend created by snow. In a blog post I describe how to do this. Since R 2.14, parallel processing has been part of the R core distribution through the parallel package. I have not tried to let plyr work with a backend generated by parallel, but I think this should work.
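For concreteness, a minimal sketch of that approach might look like the following; note that I register the backend with doParallel (rather than with snow directly), which is my own choice here:
library(plyr)
library(doParallel)   # loads foreach and provides a parallel backend for plyr

cl <- makeCluster(3)  # PSOCK cluster, so this should also work on Windows
registerDoParallel(cl)

# The three sub-tasks from the question, run in parallel via .parallel = TRUE
row_sets <- list(1:10, 11:20, 21:32)
out <- llply(row_sets,
             function(rows) cov(mtcars[rows, ], use = "complete.obs"),
             .parallel = TRUE)

stopCluster(cl)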
Upvotes: 1