Reputation: 750
Scenario
I have many notebooks open in a single Jupyter Notebook process. I have closed their tabs in my browser, but since their kernels are still running in the backend, they take up a lot of memory.
Question
Is there a proper and efficient way of shutting down ALL of the currently open notebooks, or a subset of them, without ending the Jupyter process itself?
Thanks!
Related Question
How to close IPython Notebook properly?
Upvotes: 1
Views: 4963
Reputation: 25093
The Jupyter Notebook dashboard usually opens on the Files tab.
At the top of the page there is a horizontal menu of tabs; click the Running tab.
You are now looking at a list of all the running notebooks (and terminals, if any).
You can click Shutdown next to all of the notebooks, or only a subset of them. The kernel associated with each notebook you shut down is stopped immediately, and the memory it used is released.
You can then go back to the Files tab.
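If you prefer to do this without clicking through the dashboard, the same shutdown action is exposed by the Jupyter server's REST API: GET /api/kernels lists the running kernels and DELETE /api/kernels/&lt;id&gt; stops one. Below is a minimal sketch, assuming a local server on the default port; the BASE and TOKEN values and the helper names are my own placeholders (get the real token from `jupyter notebook list`):

```python
import json
import urllib.request

BASE = "http://localhost:8888"   # assumption: local server on the default port
TOKEN = "your-token-here"        # placeholder: copy the token shown by `jupyter notebook list`

def kernels_url(base, kernel_id=None):
    # Endpoint for listing kernels (no id) or shutting one down (with id).
    path = "/api/kernels" if kernel_id is None else f"/api/kernels/{kernel_id}"
    return base + path

def shutdown_all_kernels(base, token):
    # List every running kernel, then send DELETE to each one --
    # the same thing the Shutdown button on the Running tab does.
    headers = {"Authorization": f"token {token}"}
    req = urllib.request.Request(kernels_url(base), headers=headers)
    with urllib.request.urlopen(req) as resp:
        kernels = json.load(resp)
    for k in kernels:
        delete = urllib.request.Request(
            kernels_url(base, k["id"]), headers=headers, method="DELETE"
        )
        urllib.request.urlopen(delete).close()
```

Filter the `kernels` list before the loop if you only want to stop a subset. The server process itself keeps running either way.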
Upvotes: 1
Reputation: 1398
I use Jupyter Lab most of the time, most often with the R kernel; I almost never use Python any more.
In Windows 10, the Task Manager always shows multiple R.exe and Rterm.exe processes, all with a Status of Running. There are currently 11 of each; on other occasions I've seen many more, maybe 30 or so.
This used to bother me. My thinking was that these were slowing down the PC; sometimes I think they do, and sometimes I don't think it matters. I use Bash to start Jupyter, and sometimes I suspect that starting it this way causes this behavior. RStudio doesn't seem to make it happen the way Jupyter does.
Closing the Jupyter files (tabs) does not stop these processes, so I worked out a way to stop all of the R processes at once. Instead of ending them one at a time, I found that ending the python.exe processes gets rid of all of the R processes in one go. There are currently 3 python.exe processes running, all related to the Jupyter Lab files that are open, and using "End Process" on them definitely stops all of the R processes.
But doing this makes Jupyter take longer on its next startup, and the next startup cannot avoid reopening all of the files that were open when Jupyter was killed.
Upvotes: 0