MH Ng

Reputation: 35

Large array creating MemoryError in numpy

Is there any way to resolve this MemoryError?

I have 8 GB of RAM, and I want to perform a broadcasting operation, but it turns out to be far too memory-hungry.

The operation is:

interdata = data[:, None] - data[None]

data has shape [1000, 32, 32], and I want interdata to have shape [1000, 1000, 32, 32], but that seems to be too large for my memory.
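For reference, a rough size check (assuming float64 elements) shows the result alone is roughly the size of the entire 8 GB of RAM, before counting data itself, intermediates, and the OS:

import numpy as np

# size of the desired (1000, 1000, 32, 32) result, assuming float64 elements
n_bytes = 1000 * 1000 * 32 * 32 * np.dtype(np.float64).itemsize
print(n_bytes / 1e9)   # ~8.19 GB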

Is there a way to work around this?

Upvotes: 1

Views: 91

Answers (1)

user3666197

Reputation: 1

Is there a way to work around this?

Sure, but at a cost...

numpy can work with an np.memmap()-backed array, where disk capacity becomes your next capacity "ceiling".

Yet this puts access times into the microsecond range for SSD-hosted memmap storage, or on the order of 10 ms for spinning drives, so bear that in mind.
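A minimal sketch of the idea, assuming float64 data and a placeholder filename: allocate the result on disk and fill it one slice at a time, so only a ~8 MB chunk is ever broadcast in RAM at once.

import numpy as np

data = np.random.rand(1000, 32, 32)            # stand-in for the real data

# allocate the (1000, 1000, 32, 32) result on disk instead of in RAM (~8.2 GB file)
interdata = np.memmap('interdata.dat', dtype=np.float64, mode='w+',
                      shape=(1000, 1000, 32, 32))

# fill it row by row, so only one (1000, 32, 32) slice (~8 MB) is broadcast in RAM
for i in range(data.shape[0]):
    interdata[i] = data[i] - data              # same result as data[:, None] - data[None]
interdata.flush()

Later reads can also go through np.memmap(..., mode='r'), so the result is never loaded whole.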

If this is not acceptable, there are (reasonably expensive) COTS platforms today that provide multi-TB capacities of RAM, where nanosecond access times rule.

Upvotes: 1
