user2147028

Reputation: 1291

How to create R worker subprocesses that can perform a long task while the parent R process stays active

Consider an R shiny application that can initiate a long calculation (say, a one-day calculation). I would like this application to hand the long calculation off to a worker R process and continue to serve other requests. When the calculation is done, the shiny application can then access the result through a cache or something equivalent. A generic shiny server.R function would look something like this

shinyServer(function(input, output){

  # Identify the requested calculation from the query parameters
  queryString <- reactive({ GetQueryString(input) })

  # Hand the long calculation off to a worker process (should not block)
  observe({ LaunchWorkerProcess(queryString()) })

  # Display the result once the worker has written it to the cache
  output$result <- renderText({ GetCachedResult(queryString()) })

})

This kind of pattern is common in some languages such as Node.js. There are significant resources for parallel computing in R (the CRAN High-Performance Computing task view), and I have identified some promising building blocks like rredis and doRedis. Yet I am not sure how to put the building blocks together, and I am afraid of reinventing the wheel.
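Before reaching for Redis, the placeholder helpers in the snippet above can be approximated with base R alone: launch a detached `Rscript` process with `wait = FALSE` and let it write its result to an `.rds` file that the parent polls. This is only a sketch under stated assumptions: `LaunchWorkerProcess` and `GetCachedResult` are the hypothetical helpers from the snippet, a toy squaring job stands in for the day-long calculation, and the quoting assumes a Unix-like shell.

```r
# Sketch: run the heavy job in a separate, detached R process and use an
# .rds file as the cache. Illustrative only; no job-tracking or error handling.

LaunchWorkerProcess <- function(query) {
  result_file <- file.path(tempdir(), paste0(query, ".rds"))
  if (file.exists(result_file)) return(invisible(NULL))  # result already cached
  # The worker just computes and saves its result. Here the "long calculation"
  # is a toy: square the numeric value of the query string.
  code <- sprintf("saveRDS(as.numeric('%s')^2, '%s')", query, result_file)
  # wait = FALSE detaches the worker, so the parent keeps serving requests.
  # A real app would also record that a job is in flight to avoid relaunching.
  system2("Rscript", c("-e", shQuote(code)), wait = FALSE)
}

GetCachedResult <- function(query) {
  result_file <- file.path(tempdir(), paste0(query, ".rds"))
  if (file.exists(result_file)) readRDS(result_file) else NULL  # NULL = not done yet
}
```

In the shiny context, `GetCachedResult` would be polled from a reactive (e.g. on a timer) until it stops returning `NULL`.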

How could this pattern be implemented in a smart way?

Upvotes: 1

Views: 344

Answers (1)

Greg Snow

Reputation: 49640

Look at the Rdsm package; it provides distributed shared memory, so that multiple R processes can share variables. Use it along with a package for parallel processing, such as parallel, snow, or Rmpi, to send messages between the processes that run the code.
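For the non-blocking hand-off itself, the `parallel` package (shipped with base R) already covers the simplest case on Unix: `mcparallel()` forks the long job and returns immediately, and `mccollect(..., wait = FALSE)` polls for the result without blocking the parent. A minimal sketch, with a `sum` standing in for the long calculation (fork-based, so not available on Windows; Rdsm's shared-memory API is not shown here):

```r
library(parallel)  # ships with R; mcparallel()/mccollect() are Unix-only (fork)

# Fork the long calculation; the parent gets control back immediately
job <- mcparallel(sum((1:1e6)^2))  # toy stand-in for a one-day computation

# Poll without blocking; in a shiny app this check could run on a timer
res <- NULL
while (is.null(res)) {
  res <- mccollect(job, wait = FALSE)  # NULL while the worker is still busy
  if (is.null(res)) Sys.sleep(0.5)     # parent is free to serve requests here
}
result <- res[[1]]  # mccollect() returns a list keyed by worker pid
```

The polling loop is only for illustration; the point is that `mcparallel()` does not block, so the parent process stays responsive between checks.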

Upvotes: 1
