Reputation: 9806
How would it be possible to run the following NetLogo simulations in parallel?
library(RNetLogo)
path.to.NetLogo <- "C:/Program Files (x86)/NetLogo 5.1.0" # change this path to your NetLogo directory
NLStart(path.to.NetLogo, nl.version=5)
# then open the specific model with NLLoadModel()
i <- 0.006 # starting value for beta-exit (assumed; the original code never initialises i)
while(i < 0.123)
{
  NLCommand("set beta-exit", i)
  NLCommand("setup")
  a <- NLReport("count inboxturtles with [exit = true]")
  NLCommand("go")
  e <- NLReport("total-time") / a
  i <- i + 0.009
}
Note that this statement:
NLCommand("go")
requires the most time to execute and should be run in parallel. I am hoping to do this somehow without opening multiple instances of NetLogo.
TO MAKE THE QUESTION CLEARER:
Premise: BehaviorSpace runs NetLogo simulations in parallel.
Objective: Use the same NetLogo instance started from R and run the simulations of the while loop in parallel.
Upvotes: 2
Views: 563
Reputation: 980
I assume you want to run an experiment, varying the value of your parameter beta-exit
and using all available cores on your computer in parallel. From R this means opening multiple instances of the same NetLogo model, each running on a different core (which is slightly different from your stated objective).
Jan Thiele, the creator of the RNetLogo package, has actually written a vignette about this (Link).
In your case, varying only one parameter, his example code should be exactly what you want. Here it is with some adaptations for your question:
gui <- TRUE
nl.path <- "C:/Program Files (x86)/NetLogo 5.1.0"
model.path <- "C:/..."
## To start NetLogo and open desired model
prepro <- function(gui, nl.path, model.path) {
  library(RNetLogo)
  NLStart(nl.path, gui=gui)
  NLLoadModel(model.path)
}
## simulation function
simfun <- function(i_value) {
  NLCommand("set beta-exit", i_value)
  NLCommand("setup")
  a <- NLReport("count inboxturtles with [exit = true]")
  NLCommand("go")
  e <- NLReport("total-time") / a
  ret <- data.frame(count = a, time = e)
  return(ret)
}
## To close NetLogo
postpro <- function(x) {
  NLQuit()
}
library(parallel)
processors <- detectCores()
cl <- makeCluster(processors, outfile="./log.txt")
# Logfile in working directory, oftentimes helpful as there is no console output
## Extension: If you define your own functions that are to be called
## from within the simulation, they need to be made known to each of the cores
clusterExport(cl, list("own_function1", "own_function2"))
## load NetLogo on each core
invisible(parLapply(cl, 1:processors, prepro, gui=gui,
nl.path=nl.path, model.path=model.path))
## re-set working directory for each cluster (relevant for logfile).
## There's probably a more elegant way to do this, but it gets the job done.
clusterEvalQ(cl, setwd("C:/DESIRED_WD"))
## create vector of beta-exit values
i <- seq(0.006, 0.123, 0.009)
## run simulations
result.par <- parSapply(cl, i, simfun)
invisible(parLapply(cl, 1:processors, postpro))
stopCluster(cl)
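One note on the result: because simfun() returns a one-row data frame, the matrix that parSapply() produces can be awkward to post-process. As a minimal sketch of an alternative (not from the vignette), you could replace the parSapply() line above with parLapply() and bind the pieces afterwards:
## run simulations, one data frame per beta-exit value
result.list <- parLapply(cl, i, simfun)
## bind the one-row data frames into a single data frame
result.df <- do.call(rbind, result.list)
## keep the parameter value alongside each run
result.df$beta.exit <- i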
You might also want to check out other functions for parallel computing in the snow package that can be used instead of parSapply().
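For example, if some beta-exit values take much longer to simulate than others, a load-balancing variant such as clusterApplyLB() from the parallel package hands the next value to whichever core finishes first. A minimal sketch under the same setup as above:
## load-balancing alternative: each core picks up the next beta-exit value as soon as it is free
result.lb <- clusterApplyLB(cl, i, simfun)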
Upvotes: 3