Random_Astro_Student

Reputation: 13

Using slurm array jobs

I have a jobscript that runs a certain python script with arguments provided via argparse. When I run this job, I will almost always want to submit not just one job, but several in parallel with different values of one particular argument. I have read several articles on using slurm arrays for this, by setting for instance #SBATCH --array=1-10 at the top of the jobscript, and then accessing the values of the array using $SLURM_ARRAY_TASK_ID in the call to the python script.

However, I can't find a way to make the array range over an arbitrary set of numbers. For a concrete example, say I would like to use the values [0.1, 0.75, 3, 25.5, 50]. The numbers can be assumed to be increasing, but aren't necessarily related mathematically in any other way. I know I could write these values to a separate file and read them in based on the row number, but that seems like overkill.

How can I use #SBATCH --array with arbitrary and non-integer array values?

Upvotes: 0

Views: 804

Answers (1)

damienfrancois

Reputation: 59320

The easiest way is to use an intermediate Bash array:

#!/bin/bash
#SBATCH ...
#SBATCH --array=0-4

# Each array task ID (0-4) indexes into this Bash array.
ARGS=(0.1 0.75 3 25.5 50)

./script.py "${ARGS[$SLURM_ARRAY_TASK_ID]}"

This will effectively submit a 5-job array where each job runs script.py with the value in ARGS whose index is $SLURM_ARRAY_TASK_ID.
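You can sanity-check the index-to-value mapping locally, without submitting anything to Slurm, by looping over the same indices that the --array=0-4 directive would generate (here the loop variable simply stands in for the task ID that Slurm would set):

```shell
#!/bin/bash
# Local dry run: simulate which value each array task would receive.
ARGS=(0.1 0.75 3 25.5 50)

# "${!ARGS[@]}" expands to the indices 0 1 2 3 4, matching --array=0-4.
for SLURM_ARRAY_TASK_ID in "${!ARGS[@]}"; do
    echo "task $SLURM_ARRAY_TASK_ID -> ${ARGS[$SLURM_ARRAY_TASK_ID]}"
done
```

Note that the --array range must start at 0 here, since Bash arrays are zero-indexed; with --array=1-5 you would instead index with ${ARGS[$SLURM_ARRAY_TASK_ID-1]}.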

Upvotes: 0
