Reputation: 77
I was running the following program in a Jupyter notebook:
import numpy as np
import matplotlib.pyplot as plt
f=np.fromfile(open("RSM_07_02_2019_lpf_v1","rb"),dtype=np.float32)
samp_250000=f[1:350001]
samp_250000_reshp=np.reshape(samp_250000,[7,50000])
unit_250000=np.ones([50000,50000])
cmul=(1/35000)*np.matmul(unit_250000,np.transpose(samp_250000_reshp))
diff=np.transpose(samp_250000_reshp)-cmul
cov=np.matmul(np.transpose(diff),diff)
print(cov)
The following error comes up when it is run:
MemoryError Traceback (most recent call last)
<ipython-input-5-ab4aadfd66e1> in <module>()
6 samp_250000=f[1:350001]
7 samp_250000_reshp=np.reshape(samp_250000,[7,50000])
----> 8 unit_250000=np.ones([50000,50000])
9 cmul=(1/35000)*np.matmul(unit_250000,np.transpose(samp_250000_reshp))
10 diff=np.transpose(samp_250000_reshp)-cmul
~/anaconda3/lib/python3.7/site-packages/numpy/core/numeric.py in ones(shape, dtype, order)
201
202 """
--> 203 a = empty(shape, dtype, order)
204 multiarray.copyto(a, 1, casting='unsafe')
205 return a
MemoryError:
What could be the possible reasons?
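For scale: `np.ones([50000,50000])` defaults to float64, so that single line asks for roughly 18.6 GiB. The ones-matrix multiplication is only being used to form the per-channel mean, which broadcasting can do with no large temporary. A sketch with stand-in random data, since the original file isn't available here:

```python
import numpy as np

# The failing line asks for a 50000 x 50000 array of float64 (the np.ones default):
bytes_needed = 50000 * 50000 * 8
print(bytes_needed / 2**30)  # ~18.6 GiB for that one temporary

# The ones-matrix @ transpose trick only computes the per-channel mean.
# Broadcasting does the same subtraction with no big intermediate.
# Stand-in data: 7 channels x 50000 samples each.
rng = np.random.default_rng(0)
samp = rng.standard_normal((7, 50000)).astype(np.float32)

diff = samp.T - samp.mean(axis=1)  # (50000, 7) minus per-channel means
cov = diff.T @ diff                # unnormalized 7 x 7 covariance, as in the question
print(cov.shape)                   # (7, 7)
```

`np.cov(samp)` returns the normalized 7×7 covariance matrix in a single call.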
Upvotes: 1
Views: 14907
Reputation: 2310
I think the best way to open a "bulky" notebook is to first get rid of all stdout
blocks in the notebook. I would do it using the following command:
jupyter nbconvert --ClearOutputPreprocessor.enabled=True --inplace example.ipynb
This is relevant if you have a notebook with important information that you cannot open. If you want to run a bulky query/command, you can increase the memory available to the Jupyter notebook manually in the config, or clear the kernel.
Upvotes: 1
Reputation: 196
A tip I came across on a Kaggle forum: if you are using IPython/Jupyter notebooks, make sure the current notebook is the only one running, and close other applications as well.
This worked for me; I had a couple of notebooks running in the background, and shutting them down freed up enough memory.
Upvotes: 1