Reputation: 29
I am stuck trying to work with large Python notebook files and run each cell afterwards. The one I am working on is ~70.1 KB, and when I open it, it spends a long time waiting for localhost and the socket to become available, plus several seconds (sometimes minutes) loading extensions ([MathJax]/extensions/Safe.js). When there are lots of outputs in the file, it crashes Jupyter Notebook. I recently cleared the outputs with this command in cmd:

    jupyter nbconvert --ClearOutputPreprocessor.enabled=True --inplace Notebook.ipynb

taken from this link: How to clear an IPython Notebook's output in all cells from the Linux terminal? After that the notebook opened with less delay, but in compensation, saving a checkpoint and interrupting the kernel became much worse (those buttons stay blue as if they were still being clicked and nothing happens), and I ended up having to copy the cells into Notepad separately and run them one by one in another Python 3 notebook. Do you know any technique or procedure to fix this Jupyter Notebook slowness and make it faster, or does it have something to do with the PC's performance? Thanks in advance.
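(For reference, the same output-clearing step can also be done in Python with nbformat; this is only a rough, untested sketch equivalent to the nbconvert command above, assuming the file is named Notebook.ipynb in the current directory.)

    # Clear all stored cell outputs from a notebook file in place.
    import nbformat

    nb = nbformat.read("Notebook.ipynb", as_version=4)
    for cell in nb.cells:
        if cell.cell_type == "code":
            cell.outputs = []             # drop saved figures/text outputs
            cell.execution_count = None   # reset the In[ ] counters
    nbformat.write(nb, "Notebook.ipynb")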
Upvotes: 2
Views: 3565
Reputation: 1
I think there was a limit to how many cells and how much output I could have in one Jupyter notebook file.
In my case, for example, I could have kept using the same file if I had cleared the cell outputs or even deleted some of the cells themselves. Some of the cells had close to 100 figures as output, and I didn't want to clear those.
I ended up continuing my work in a new Jupyter notebook file.
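If you want to keep the heavy outputs but still split the work across files, a minimal sketch of copying the later cells into a new notebook with nbformat might look like this (file names and the split point are placeholders, not from the question):

    # Copy the second half of a large notebook into a new file,
    # leaving the original untouched.
    import nbformat
    from nbformat.v4 import new_notebook

    src = nbformat.read("big_notebook.ipynb", as_version=4)
    split_at = len(src.cells) // 2            # choose any cell index

    part2 = new_notebook(metadata=src.metadata)
    part2.cells = src.cells[split_at:]        # later cells go to the new file
    nbformat.write(part2, "big_notebook_part2.ipynb")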
Upvotes: 0
Reputation: 8931
My performance problems on Windows were mitigated by: greatly increasing the pagefile size, shutting down the kernels of unused notebooks, and installing the memory widget to monitor memory usage. Double check http://localhost:8888/tree to verify all notebooks are shut down.
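You can also check for stray servers from the command line with jupyter notebook list. A small Python sketch of the same check, assuming the classic Notebook package (on newer installs the same function lives in jupyter_server.serverapp):

    # Print every Jupyter server still running on this machine.
    from notebook import notebookapp

    for server in notebookapp.list_running_servers():
        print(server["url"], server.get("notebook_dir", ""))

Anything listed there is still holding kernels (and memory) until you shut it down from its /tree page or kill the process.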
Upvotes: 1