Reputation: 35
I have a functional Snakemake profile for running jobs on the HPC using SLURM. If I run the workflow from my terminal, the jobs are submitted in parallel and everything works as expected.
However, this pipeline takes a while, and I have to VPN into a protected environment to work with this data, so I cannot keep my terminal open for the whole run, as the VPN has a timeout.
So I would like to use an .sbatch script that can handle this for me. Minimally, something like:
#!/bin/bash
#SBATCH --account=<acct>
#SBATCH --partition=<part>
#SBATCH --time=48:00:00
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --output=./logs/log.out
#SBATCH --error=./logs/log.out
# Move to snakemake directory.
cd path/to/workdir
# Make logs directory.
mkdir -p logs
# Load environment.
ml miniconda3/latest
module use ~/MyModules
source /path/to/miniconda/etc/profile.d/conda.sh
conda activate snakemake
# Run preprocessing pipeline.
snakemake --slurm --profile profile/cluster/
However, when I do this, my jobs fail with the error message:
srun: fatal: SLURM_MEM_PER_CPU, SLURM_MEM_PER_GPU, and SLURM_MEM_PER_NODE are mutually exclusive.
This error does not happen when I launch Snakemake directly from the terminal, so the problem shouldn't be in the profile itself.
How can I use an sbatch script to launch Snakemake and have it handle the rest of the SLURM job submissions?
Upvotes: 0
Views: 1673
Reputation: 21
Have a look here: https://github.com/snakemake/snakemake/issues/2230
Apparently, just adding '--mem=2G' to the sbatch job solves the issue; I have tested it.
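As a minimal sketch, applied to the script from the question, this just means adding an explicit memory request to the outer job (the 2G value is an assumption; the Snakemake orchestrator itself needs very little memory). Presumably the outer allocation otherwise exports a SLURM_MEM_PER_* environment variable that clashes with what Snakemake's srun calls set, and requesting --mem explicitly avoids that:
#!/bin/bash
#SBATCH --account=<acct>
#SBATCH --partition=<part>
#SBATCH --time=48:00:00
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --mem=2G
#SBATCH --output=./logs/log.out
#SBATCH --error=./logs/log.out
# ... rest of the script unchanged ...
snakemake --slurm --profile profile/cluster/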
Upvotes: 1
Reputation: 31
Although this is not exactly what you are asking, I still think this might help you.
If the only problem is the termination of your SSH session, then I would suggest starting your workflow in a new terminal session with screen.
First, connect to your remote server.
Open a new session:
screen -S your_chosen_name
Activate your environment and start the workflow:
conda activate env_name;
snakemake -j 48 ....
Now, if your SSH session is terminated or you close it yourself, the task will keep running in that other terminal session. To reattach to it:
screen -r your_chosen_name
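Putting it together, a minimal sketch of the whole sequence, using the flags from the question (the session name "smk" is a placeholder; Ctrl-a d is screen's default detach key):
# On the remote server, start a named screen session.
screen -S smk
# Inside the screen session: activate the environment and launch Snakemake.
conda activate snakemake
snakemake -j 48 --slurm --profile profile/cluster/
# Detach with Ctrl-a d (or simply let the SSH/VPN connection drop).
# Later, list running sessions and reattach:
screen -ls
screen -r smk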
Upvotes: 0