pedro

Reputation: 463

How to send jobs to several nodes?

I have R code that I would like to execute on several nodes with SLURM. This is the code:

...
res <- foreach(i = seq_len(nrow(combs))) %dopar% {
  G1 <- split[[combs[i, 1]]]
  G2 <- split[[combs[i, 2]]]
  bind <- cbind(data[, G1], data[, G2])
  cor_rho(bind)   # cor_rho is a function I created; its value is collected into res
}
...

The code compares several submatrices: for each pair of them, I compute the correlation with my function cor_rho. I want to execute each comparison on a node of the cluster.
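For reference, a hypothetical sketch of how the inputs might be set up (the real definitions are in the elided part of my script; the grouping factor grp is just an illustration):

split <- split(seq_len(ncol(data)), grp)  # column indices of data, grouped by a factor grp
combs <- t(combn(length(split), 2))       # one row per pair of groups to compare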

Any help?

Best

Upvotes: 1

Views: 155

Answers (1)

dvitsios

Reputation: 458

You need to call your R script from a bash script, which you will then submit to SLURM requesting a certain number of cores.
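Note that %dopar% only runs in parallel if a parallel backend has been registered inside the R script; otherwise foreach silently falls back to sequential execution. A minimal sketch using doParallel (assuming that package is installed), picking up the core count SLURM grants:

library(foreach)
library(doParallel)

# use however many cores SLURM allocated via --cpus-per-task,
# falling back to 1 when the variable is unset (e.g. interactive runs)
n_cores <- as.integer(Sys.getenv("SLURM_CPUS_PER_TASK", unset = "1"))
registerDoParallel(cores = n_cores)

Your foreach(...) %dopar% loop then runs unchanged on those cores.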

If your R code is in a file called r_analysis.R, you can create a submit_job.sh file like this:

#!/bin/bash

Rscript r_analysis.R

Finally, you submit your job from the command line, requesting a certain number of CPUs (8 in this example) and memory per CPU (4G in this example):

sbatch --mem-per-cpu=4G --cpus-per-task=8 ./submit_job.sh
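Equivalently, the resource requests can live inside the submit script as #SBATCH directives, so a plain sbatch ./submit_job.sh is enough (the job name here is an illustrative assumption):

#!/bin/bash
#SBATCH --job-name=r_analysis   # illustrative name
#SBATCH --cpus-per-task=8       # cores available to the R backend
#SBATCH --mem-per-cpu=4G        # memory per core

Rscript r_analysis.R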

Upvotes: 1
