Reputation: 2712
I am preparing a Jupyter notebook which uses large arrays (1-40 GB), and I want to give its memory requirements, or rather:

- the amount of memory N needed to run the notebook's code in plain Python, and
- the amount of memory M needed to run the notebook in Jupyter.

The best idea I have is to:

1. run /usr/bin/time -v jupyter notebook and record the server's maximum resident set size S,
2. run /usr/bin/time -v ipython notebook.py (the notebook converted to a plain script) and record its maximum resident set size C,
3. then assume N > C and M > S + C.
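The same peak figure can also be read from inside the process with the standard-library resource module; a minimal sketch (on Linux, where ru_maxrss is reported in kilobytes):

import resource

# High-water mark of this process's resident set size so far.
# ru_maxrss is in kilobytes on Linux (bytes on macOS).
peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print(f"peak RSS so far: {peak_kb / 2**20:.2f} GB")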
I think there must be a better way.
Upvotes: 0
Views: 2250
Reputation: 45
The jupyter-resource-usage extension comes preinstalled with the Jupyter notebook found in Anaconda installations. It shows, in the top-right corner of the notebook interface, how much memory you have available and how much the notebook is using.
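If you are not using Anaconda, the extension is also available on PyPI and can be installed with pip install jupyter-resource-usage.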
Upvotes: 1
Reputation: 78
I think your task is hard.

You have no guarantee that Python actually keeps every variable in RAM; the OS may have decided to push some of that memory out to disk via swap.

You can try disabling swap, but there may be other layers that still cache data.
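You can at least detect whether swap came into play during the run. A minimal sketch, assuming the third-party psutil package (my choice of tool, nothing Jupyter-specific):

import psutil

# Compare system-wide swap usage before and after the heavy cells;
# a large delta means max-RSS figures understate the real requirement.
before = psutil.swap_memory().used
# ... run the notebook's memory-hungry cells here ...
after = psutil.swap_memory().used
print(f"swap growth during run: {(after - before) / 2**30:.2f} GB")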
You can force garbage collection in Python using the gc module, but the results were inconsistent when I tried it.
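For reference, forcing a collection and re-reading the footprint looks like this (a sketch, again assuming psutil; the inconsistency is plausibly because CPython's allocator does not always return freed memory to the OS):

import gc
import psutil

gc.collect()  # force a full collection of all generations
rss = psutil.Process().memory_info().rss
# RSS can stay high even after a successful collection, since freed
# blocks may be kept pooled by the allocator rather than released.
print(f"RSS after gc.collect(): {rss / 2**30:.2f} GB")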
I don't know whether M and N are actually useful to you, or whether you are just trying to size a server. If it's the latter, renting servers of increasing size on AWS or DigitalOcean and running your notebook as a benchmark on each may give you faster and more reliable results.
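A cheaper local variant of that benchmark (my own sketch, not something the cloud route requires) is to cap the address space of the script run and bisect for the smallest cap that still succeeds. Note that RLIMIT_AS limits virtual memory, which overestimates resident memory:

import resource
import subprocess

def fits(mem_bytes, cmd):
    # Run cmd with its virtual address space capped; exit status 0
    # means the run fit under the cap.
    def cap():
        resource.setrlimit(resource.RLIMIT_AS, (mem_bytes, mem_bytes))
    return subprocess.run(cmd, preexec_fn=cap).returncode == 0

cmd = ["ipython", "notebook.py"]  # the converted script from the question
lo, hi = 1, 64  # search range in GB; adjust to your machine
while lo < hi:
    mid = (lo + hi) // 2
    if fits(mid * 2**30, cmd):
        hi = mid
    else:
        lo = mid + 1
print(f"smallest working cap: about {lo} GB")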
Upvotes: 1