Say I create an array:
import numpy as np

a = np.ones((21600, 13, 3, 128), dtype=complex)
I get a MemoryError.
If I halve the size of the array (N.B. this took more than 10 minutes to create on my machine):
b = np.ones((10800, 13, 3, 128), dtype=complex)
Its size in GB is:
b.nbytes/1024**3 = 0.803375244140625 GB
This is well below the amount of RAM in my laptop (2 GB), so I would have assumed that creating 'a' should have worked. What are the limiting factors stopping me from dealing with such big arrays?
Ideally, I would like to create an array of shape (86400, 13, 3, 128) with dtype=complex. Is there any way to do this without splitting it up?
Upvotes: 1
Views: 157
If your laptop has 2 GB of RAM and a would take 1.6 GB of it, then the chances are pretty high that there is simply not enough memory left: Windows, your browser, mail, etc. will already be taking a good chunk of it.
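To see where that 1.6 GB figure comes from, you can compute the required size without allocating anything. A quick sketch (complex defaults to NumPy's 16-byte complex128):

import numpy as np

# Bytes the full (21600, 13, 3, 128) array would need, without allocating it.
shape = (21600, 13, 3, 128)
itemsize = np.dtype(complex).itemsize   # complex -> complex128, 16 bytes
required = np.prod(shape) * itemsize
print(required / 1024**3)               # ~1.61 GB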
As an additional complication, numpy will need a single, contiguous 1.6 GB block of memory for the array, which lowers the chances of finding such a big block even further.
If your application/use case permits, it might be worth storing your data in a sparse matrix. This stores only the non-zero elements, which can save a lot of memory (or not): http://docs.scipy.org/doc/scipy/reference/sparse.html
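A minimal sketch of that idea, assuming the data is mostly zeros. Note that scipy's sparse matrices are 2-D, so the (21600, 13, 3, 128) shape is flattened to (21600, 13*3*128) here, and the row/column indices and values below are made-up placeholders:

import numpy as np
from scipy import sparse

# Hypothetical non-zero entries; in practice these come from your data.
rows = np.array([0, 5, 21599])
cols = np.array([0, 100, 4991])
vals = np.array([1 + 2j, 3j, -1 + 0j])

# Build the sparse matrix directly, never materialising the dense array.
s = sparse.coo_matrix((vals, (rows, cols)),
                      shape=(21600, 13 * 3 * 128)).tocsr()

# Only the non-zero entries (plus their index arrays) are stored:
print(s.data.nbytes + s.indices.nbytes + s.indptr.nbytes)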
Upvotes: 1