Reputation: 31
I am doing some analyses using VS Code on a remote server that has SLURM installed to manage jobs and provide parallel computing. I would like to run each cell of my Jupyter notebook as an interactive SLURM job, the same way command-line code runs as an interactive SLURM job after I have used srun to request compute nodes. The jobs I need to run from the notebook require a lot of memory, so I need to run them through SLURM.
My current workaround is to run srun in the terminal, start a Python interpreter, and then copy and paste the code from each notebook cell into that interpreter. I'd really appreciate your help.
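For context, the manual workaround described above looks roughly like this (the partition name and resource amounts are placeholders; adjust them for your cluster):

```shell
# Request an interactive allocation (example values only)
srun --partition=defq --mem=64G --cpus-per-task=4 --pty bash

# Then, on the allocated compute node:
python
# ...and paste each notebook cell into the interpreter by hand.
```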
Upvotes: 2
Views: 2517
Reputation: 11
A well-suited solution to this problem is the Submitit library (see its GitHub repository). The library interfaces with SLURM and lets you submit Python functions as jobs directly from your notebook.
A simple test example, adapted from an Oakland University tutorial page, would be:
import submitit
import os

def primes(nprimes):
    # Load the Python module on the compute node before doing any work
    os.system('module load Python')
    n = nprimes
    for p in range(2, n + 1):
        for i in range(2, p):
            if p % i == 0:
                break
        else:
            print(p)
    print('Done')

log_folder = "log_test/%j"  # %j is replaced by the SLURM job ID
executor = submitit.AutoExecutor(folder=log_folder)
executor.update_parameters(slurm_job_name="PrimesTest", tasks_per_node=1,
                           nodes=1, gpus_per_node=1, timeout_min=300,
                           slurm_partition="defq")
job = executor.submit(primes, 1000000)
print(job.job_id)      # ID of your job
output = job.result()  # waits for the job to finish and returns its result
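As a side note, the trial-division loop in primes() is quadratic and will be very slow for nprimes=1000000. If the job itself is the bottleneck, a sieve of Eratosthenes is a much faster body for the submitted function; a minimal sketch (sieve_primes is my name, not part of the original example):

```python
def sieve_primes(n):
    # Sieve of Eratosthenes: return all primes <= n
    if n < 2:
        return []
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            # Mark every multiple of p starting from p*p as composite
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [p for p in range(2, n + 1) if is_prime[p]]

print(sieve_primes(30))  # → [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

You would submit it exactly like the original function, e.g. executor.submit(sieve_primes, 1000000).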
Upvotes: 1
Reputation: 36
This is an old question, but I am answering as I also came across this problem recently.
After you run srun in a terminal, you should be able to ssh directly into your compute node from VS Code and use all the capabilities of the compute node in an interactive session or notebook.
The steps I take, for example, are:
1. srun into a node
2. Add that node to your ssh config file, so that you can ssh into it
Upvotes: 1
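The ssh config entry for step 2 could look roughly like this (the host alias, user, node name, and login host are all placeholders for your cluster; the ProxyJump line is only needed if compute nodes are not reachable directly from your machine):

```
# ~/.ssh/config
Host mynode
    HostName node001                       # the compute node srun allocated to you
    User myuser
    ProxyJump login.cluster.example.com    # hop through the login node if required
```

VS Code's Remote-SSH extension can then connect to "mynode" and run the notebook on the compute node.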