martianwars

Reputation: 6500

Efficient way to partially read large numpy file?

I have a huge numpy 3D tensor stored as a binary .npy file on my disk, which I normally read using np.load. On using np.load, I quickly end up using most of my memory.

Luckily, at every run of the program, I only require a certain slice of the huge tensor. The slice is of a fixed size and its dimensions are provided from an external module.

What's the best way to do this? The only way I could figure out is somehow storing this numpy tensor in a MySQL database, but I'm sure there are much better / easier ways. I'd also be happy to build my 3D tensor file differently if that would help.


Does the answer change if my tensor is sparse in nature?

Upvotes: 37

Views: 27239

Answers (1)

Aaron

Reputation: 11075

Use numpy.load as normal, but be sure to specify the mmap_mode keyword so that the array is kept on disk, and only the necessary bits are loaded into memory upon access.

mmap_mode : {None, ‘r+’, ‘r’, ‘w+’, ‘c’}, optional
If not None, then memory-map the file, using the given mode (see numpy.memmap for a detailed description of the modes). A memory-mapped array is kept on disk. However, it can be accessed and sliced like any ndarray. Memory mapping is especially useful for accessing small fragments of large files without reading the entire file into memory.

The modes are described in numpy.memmap:

mode : {‘r+’, ‘r’, ‘w+’, ‘c’}, optional
The file is opened in this mode:
‘r’ Open existing file for reading only.
‘r+’ Open existing file for reading and writing.
‘w+’ Create or overwrite existing file for reading and writing.
‘c’ Copy-on-write: assignments affect data in memory, but changes are not saved to disk. The file on disk is read-only.

*Be sure not to use 'w+' mode, as it will erase your file's contents.
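A minimal sketch of the approach (the file name and array shape here are made up for illustration): save a small array to a .npy file, memory-map it with mmap_mode='r', and slice out only the piece you need.

```python
import numpy as np

# Stand-in for the real large tensor: a small 3D array saved as .npy.
arr = np.arange(24, dtype=np.float64).reshape(2, 3, 4)
np.save("big_tensor.npy", arr)

# Memory-map the file read-only; the data stays on disk.
mmap = np.load("big_tensor.npy", mmap_mode="r")

# Slicing only reads the touched portion of the file.
chunk = mmap[0, 1:3, :]

# Copy into a regular in-memory array if you need to modify it
# (the 'r' mapping is read-only).
chunk = np.array(chunk)
print(chunk.shape)  # (2, 4)
```

Because the mapping is read-only, any writes must go through a copy; opening with 'r+' would instead write changes back to the file.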

Upvotes: 53
