Reputation: 1
I'm currently working on a project where I use Python functions, wrapped in a module, inside MATLAB code. The MATLAB part of the code is an MCMC (Markov chain Monte Carlo, with multiple chains) computation, so to speed it up I'm running a parfor loop on a cluster.
To be more specific, the algorithm can be thought of as follows:
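(A minimal sketch of the structure described here; `nSteps`, `nChains`, the module name `mymodule`, and the function `propose` are hypothetical placeholders:)

```matlab
for step = 1:nSteps                 % outer loop over MCMC chain steps
    parfor chain = 1:nChains        % inner parallel loop over chains
        % currently the Python module has to be reloaded here,
        % on every single iteration (hypothetical module name)
        pymod = py.importlib.import_module('mymodule');
        out(chain) = double(pymod.propose(chain));   % hypothetical call
    end
end
```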
My problem is that the only way I've found for MATLAB to use the Python-defined functions is to reload the Python module on each parfor iteration, but because of how the code works this also means once per chain step (the parfor is nested inside), and that costs time.
My question is: is there a smarter, faster way to use Python libraries within MATLAB (something equivalent to MEX, perhaps)? Otherwise, is there a way to "store" the Python module info on each worker at the beginning, without needing to reload the module every time I advance the outer loop?
Any hint would be really appreciated! Thanks a lot
Giulia
Upvotes: 0
Views: 517
Reputation: 25160
You might be able to take advantage of parallel.pool.Constant here. This allows you to set up some "constant" data to be used by multiple iterations of a parfor loop, or even by multiple parfor loops. The linked reference page shows you how to build a parallel.pool.Constant using a function handle - you probably want that to be a function handle that loads your module.
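A minimal sketch of that approach, assuming a hypothetical module name `mymodule` and function `my_function`; the Constant's build function runs once per worker rather than once per iteration:

```matlab
% Build the Constant: the function handle executes once on each worker,
% importing the Python module there and caching it in c.Value.
c = parallel.pool.Constant(@() py.importlib.import_module('mymodule'));

for step = 1:nSteps                  % outer MCMC loop
    parfor chain = 1:nChains         % parallel loop over chains
        pymod = c.Value;             % already-loaded module on this worker
        out(chain) = double(pymod.my_function(chain));  % hypothetical call
    end
end
```

Because the Constant survives across parfor loops, the import cost is paid once per worker for the whole run, not once per outer step.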
Upvotes: 0
Reputation: 1448
I believe you're looking for pctRunOnAll. From the MATLAB documentation:
This is useful if there are setup changes that need to be performed on all the workers and the client.
You should be able to modify your algorithm accordingly:
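A sketch of what that could look like, again with hypothetical names (`mymodule`, `my_function`): the import runs once on the client and on every worker, so Python's module cache makes later imports essentially free:

```matlab
parpool;   % or parpool('yourClusterProfile')

% Import the module once everywhere (client + all workers).
pctRunOnAll py.importlib.import_module('mymodule');

for step = 1:nSteps
    parfor chain = 1:nChains
        % the module is already cached in each worker's Python interpreter,
        % so this call no longer pays the load cost on every iteration
        out(chain) = double(py.mymodule.my_function(chain));
    end
end
```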
Upvotes: 0