Reputation: 2796
I would like to use IPython/Jupyter to set up 100 jobs on a computing cluster to perform some calculations.
In Python, I would do the following:
from IPython.parallel import Client
c = Client()
c[:].apply_sync(lambda : "Hello, World")
Or, using ipython-cluster-helper:
from cluster_helper.cluster import cluster_view

with cluster_view(scheduler="lsf", queue="myqueue", num_jobs=100) as view:
    result = view.map(myfunc, params)
Is it possible to access IPython's parallel abilities from an R kernel?
If yes, how?
R has some parallel processing capabilities of its own, but what I am asking about would need to be functionality provided by the IRkernel.
I would expect that if I run the line below (with a corresponding function to access the IPython/Jupyter cluster), I would see parallel execution times:
parallel_access_func(1:4, function(x) { re = as.character(Sys.time()); Sys.sleep(5); re })
Upvotes: 1
Views: 1325
Reputation: 2319
I don't think it is possible the way you imagine it: ipyparallel is a Python parallel execution library that uses Jupyter ipykernels as its backend. It can be used more or less independently of the notebook/IPython.
To make this work with an R kernel, you would need to implement an irparallel library that does the same with the irkernel, i.e. at least a function that talks to a controller, plus R engines that start irkernels instead of ipykernels. None of this exists yet, nor, as far as I know, is it planned.
There are other R parallel libraries (just as there are other Python parallel libraries) that might do what you want, although I have no experience with them; a minimal sketch using base R's parallel package follows below. Have a look at the CRAN High-Performance Computing task view: https://cran.r-project.org/web/views/HighPerformanceComputing.html
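For illustration, here is a minimal sketch using base R's built-in parallel package rather than IPython/Jupyter; it reruns the timing test from the question on four local worker processes (the names cl and res are arbitrary):

library(parallel)

cl <- makeCluster(4)              # start 4 local R worker processes
res <- parLapply(cl, 1:4, function(x) {
  re <- as.character(Sys.time())  # record the start time on the worker
  Sys.sleep(5)
  re
})
stopCluster(cl)
print(unlist(res))                # with parallel execution, all four timestamps agree

If the calls really run in parallel, the whole block finishes in roughly 5 seconds instead of 20. For submitting jobs to an LSF or SGE scheduler instead of local processes, packages such as batchtools (listed in the task view above) cover that use case.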
Upvotes: 0
Reputation: 2289
It is possible, but the R kernel would need to be re-implemented on top of MetaKernel. Any MetaKernel-based kernel can be run in parallel using the %parallel and %px magics, as sketched below. It might not be too difficult to reimplement.
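To give a feel for it, here is a hypothetical notebook-cell sketch. The %parallel and %px magics do exist in MetaKernel (they drive an ipyparallel cluster underneath), but the module and class names irmetakernel and IRMetaKernel below are made up, since no MetaKernel-based R kernel exists yet:

# Cell 1: point MetaKernel's %parallel magic at the engines' kernel class
# (module and class names here are hypothetical)
%parallel irmetakernel IRMetaKernel

# Cell 2: broadcast an R expression to all engines with %px
%px { re <- as.character(Sys.time()); Sys.sleep(5); re }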
Upvotes: 1