Reputation: 3607
I'm using numpy to create a cube array with sides of length 100, thus containing 1 million entries total. At each of the million entries, I am inserting a 100x100 matrix whose entries consist of randomly generated numbers. I am using the following code to do so:
import random
from numpy import *

cube = arange(1000000).reshape(100, 100, 100)
for element in cube.flat:
    matrix = arange(10000).reshape(100, 100)
    for entry in matrix.flat:
        entry = random.random() * 100
    element = matrix
I was expecting this to take a while, but with 10 billion random numbers being generated, I'm not sure my computer can even handle it. How much memory would such an array take up? Would RAM be a limiting factor, i.e. if my computer doesn't have enough RAM, could it fail to actually generate the array?
Also, if there is a more efficient way to implement this code, I would appreciate tips :)
Upvotes: 9
Views: 17122
Reputation: 5877
For the "inner" part of your loop, look at the numpy.random module:
import numpy as np
matrix = np.random.random((100,100))*100
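The same idea removes the outer loop as well: since every cell of the cube holds a 100x100 matrix, the whole structure is just one 5-D array built in a single call. A minimal sketch, using scaled-down demo sizes because the full (100, 100, 100, 100, 100) array of float64 values would need roughly 80 GB of RAM:

```python
import numpy as np

# Full size would be (100, 100, 100, 100, 100): 10 billion float64
# values at 8 bytes each, about 80 GB. These demo sizes keep it tiny.
cube = np.random.random((10, 10, 10, 10, 10)) * 100

print(cube.shape)           # (10, 10, 10, 10, 10)
print(cube[0, 0, 0].shape)  # (10, 10) -- the "matrix" stored at one cube cell
```

This also bears on the memory question: at 8 bytes per float64, the full-size array needs far more RAM than a typical machine has, so a smaller dtype or smaller dimensions would be required in practice.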
Upvotes: 2
Reputation: 154484
A couple of points:

- Since cube.dtype is int64 and the array has 1,000,000 elements, it requires 1000000 * 64 / 8 = 8,000,000 bytes (8 MB).
- When iterating over cube.flat, the assignment element = matrix simply rebinds the loop variable element, leaving cube unchanged. The same goes for entry = random.random() * 100.
Upvotes: 23