Reputation: 470
I want to create a unique directory for each Slurm job I run. However, mkdir appears to interrupt the #SBATCH directives. E.g. when I try:
#!/bin/bash
#SBATCH blah blah other Slurm commands
mkdir /path/to/my_dir_$SLURM_JOB_ID
#SBATCH --chdir=/path/to/my_dir_$SLURM_JOB_ID
touch test.txt
...the Slurm execution faithfully creates the directory at /path/to/my_dir_$SLURM_JOB_ID, but skips over the --chdir option and runs the batch script from the working directory the job was submitted from.
Is there a way to create a unique directory for the output of a job and set the working directory there within a single sbatch script?
Upvotes: 3
Views: 5896
Reputation: 59110
First off, the #SBATCH options must be at the top of the file. Citing the documentation, they must come
before any executable commands
So it is expected behaviour that the --chdir option is not honoured in this case. The rationale is that the #SBATCH options, and --chdir in particular, are used by Slurm to set up the environment in which the job starts. That environment must be decided before the job starts, and cannot be modified afterwards by Slurm.
For similar reasons, environment variables are not processed in #SBATCH options; they are simply ignored by Bash, since they sit on a commented line, and Slurm makes no effort to expand them itself.
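You can reproduce this behaviour without Slurm at all, since everything from # to the end of a line is a comment to Bash (job.sh is a hypothetical file name used for illustration):

```shell
# Bash treats the whole #SBATCH line as a comment, so the $SLURM_JOB_ID on it
# is never expanded by the shell; Slurm reads that line as literal text.
cat > job.sh <<'EOF'
#!/bin/bash
#SBATCH --chdir=/path/to/my_dir_$SLURM_JOB_ID
echo "running in $PWD"
EOF
bash job.sh   # the #SBATCH line is skipped; the script runs in the current dir
```

Running it shows the script simply executes in whatever directory it was started from, with the #SBATCH line having no effect on Bash.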
Also note that --chdir is used to
Set the working directory of the batch script to directory before it is executed.
and that directory must exist; Slurm will not create it for you.
What you need to do is call the cd
command in your script.
#!/bin/bash
#SBATCH blah blah other Slurm commands
WORKDIR=/path/to/my_dir_$SLURM_JOB_ID
mkdir -p "$WORKDIR" && cd "$WORKDIR" || exit 1
touch test.txt
Note the exit 1 (Bash exit codes range from 0 to 255, so exit -1 is not portable): if creating or entering the directory fails, your job stops rather than continuing in the submission directory.
As a side note, it is always a good idea to add a set -euo pipefail
line in your script. It makes sure your script stops if any command fails, an unset variable is referenced, or any part of a pipeline fails.
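The effect is easy to see with a throwaway script (demo.sh is a hypothetical name used only for this illustration):

```shell
# Write a small script that uses set -euo pipefail, then run it to watch
# the early exit in action.
cat > demo.sh <<'EOF'
#!/bin/bash
set -euo pipefail
# -e: exit immediately if any command fails
# -u: treat references to unset variables as errors
# -o pipefail: a pipeline fails if any command in it fails, not just the last
echo "step 1"
false            # this command fails: with -e, the script stops here
echo "step 2"    # never reached
EOF
bash demo.sh || echo "script aborted early"
```

In a Slurm script this means a failed mkdir or cd stops the job immediately, instead of letting later commands scribble output into the wrong directory.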
Upvotes: 4