Nikita Vlasenko

Reputation: 4352

Submit new jobs with snakemake when previous ones are not yet finished on SLURM cluster

I am running Snakemake on a SLURM cluster and I have the following problem: the cluster allows me to submit only a limited number of jobs (around 20) at a time. After running snakemake.sh, which is:

#!/bin/bash

INPUT_DIR=...

snakemake -j 190 --latency-wait 1000 --cluster-config cluster.json \
    --cluster "sbatch -A {cluster.A} -p {cluster.p} -t {cluster.time} \
    --output {cluster.output} --error {cluster.error} --nodes {cluster.nodes} \
    --ntasks {cluster.ntasks} --cpus-per-task {cluster.cpus} --mem {cluster.mem}"

only 20 jobs run (not 190), so I end up waiting until all 20 finish and then rerunning the script. This is obviously not optimal. Let's say 15 jobs have completed but 5 are still running: is there a way to submit 15 more in the meantime?
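For reference, the cluster.json consumed by --cluster-config maps rule names (or the special __default__ key) to the placeholder keys used in the sbatch string above. All the values below are hypothetical, just to show the shape:

```json
{
    "__default__": {
        "A": "myaccount",
        "p": "normal",
        "time": "01:00:00",
        "output": "logs/slurm-%j.out",
        "error": "logs/slurm-%j.err",
        "nodes": 1,
        "ntasks": 1,
        "cpus": 4,
        "mem": "8G"
    }
}
```

Per-rule entries with the same keys can be added alongside __default__ to override resources for specific rules.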

Upvotes: 0

Views: 269

Answers (1)

dariober

Reputation: 9062

A couple of thoughts:

  • Are you sure that additional jobs can be submitted before previous ones have finished? For example, it may be that downstream jobs require as input the file(s) produced by the previous 20 jobs. This may be the case for a rule that merges files.

  • You say *the cluster allows me to submit only a number (around 20) jobs at a time*. Maybe check that the limit is imposed by the cluster rather than by Snakemake. Try submitting a bunch of dummy jobs and see whether SLURM accepts them all into the queue.

For example (not tested, just to get the idea):

# submit 30 trivial jobs; each sleeps briefly and leaves a marker file
for i in {1..30}
do
    sbatch --wrap "sleep 30 && touch test${i}.tmp"
done
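Since a real scheduler isn't always at hand, the same loop-and-marker idea can be dry-run locally with background processes standing in for sbatch jobs. This is only a local sketch (the temp directory and the job count of 5 are made up), but it verifies the marker-file counting logic:

```shell
#!/bin/bash
# Local stand-in for the sbatch loop above (no SLURM needed):
# launch 5 short background "jobs", wait for all of them, then
# count the marker files they produced.
tmpdir=$(mktemp -d)
for i in {1..5}
do
    ( sleep 1 && touch "${tmpdir}/test${i}.tmp" ) &
done
wait                                                   # block until every background job finishes
count=$(ls "${tmpdir}"/test*.tmp | wc -l | tr -d ' ')  # count should equal 5
echo "${count}"
rm -r "${tmpdir}"
```

On a real cluster you would instead watch the queue (e.g. with squeue) to see how many of the submitted jobs SLURM actually accepted at once.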

Upvotes: 1
