user3262424

Reputation: 7489

Python -- Share a Numpy Array Between Processes?

I'd like to use python's multiprocessing module to utilize a multi-core Linux server.

I need all processes to have read/write access to the same shared memory.

Instead of using a list or a queue, is it possible to have a multi-dimensional numpy array as the shared object?

Upvotes: 12

Views: 1780

Answers (3)

Samuel

Reputation: 2490

Look at this. It doesn't seem easy, but it's doable.

Edit: The original link rotted; I have linked to another copy.

Upvotes: 6

J.J

Reputation: 3607

I found that even if you do not modify your numpy array after fork()'ing a bunch of child processes, you will still see your RAM skyrocket as the child processes copy-on-write the object for some reason.

You can limit (or totally alleviate?) this problem by setting

yourArray.flags.writeable = False

BEFORE fork()'ing/Pool()'ing, which seems to keep RAM usage down, and is a LOT less hassle than the other methods :)

Upvotes: 1

Robert

Reputation: 131

I think I know what you're looking for: https://bitbucket.org/cleemesser/numpy-sharedmem/issue/3/casting-complex-ndarray-to-float-in

There's a short description on the web page: "A shared memory module for numpy by Sturla Molden and G. Varoquaux that makes it easy to share memory between processes in the form of NumPy arrays. Originally posted to the SciPy-user mailing list."

I myself am using it exactly that way, sharing NumPy arrays between processes, and it works very well for me.
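For reference, since Python 3.8 the standard library covers the same use case with `multiprocessing.shared_memory`, without a third-party module. A minimal single-process sketch (the shape and values are illustrative; a second process would attach with the same `name=` call shown for `shm2`):

```python
from multiprocessing import shared_memory
import numpy as np

# Create a named shared block sized for a 4x6 float64 array (8 bytes each).
shm = shared_memory.SharedMemory(create=True, size=4 * 6 * 8)
a = np.ndarray((4, 6), dtype=np.float64, buffer=shm.buf)
a[:] = 1.0

# Another process would attach to the same block by name.
shm2 = shared_memory.SharedMemory(name=shm.name)
b = np.ndarray((4, 6), dtype=np.float64, buffer=shm2.buf)
b[0, 0] = 9.0           # visible through `a` too: same memory

value = float(a[0, 0])  # read back through the first view
del a, b                # drop the NumPy views before closing the mapping
shm2.close()
shm.close()
shm.unlink()            # free the block once no process needs it
```

Note that the creator should `unlink()` the block exactly once, after all processes have closed their handles.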

Upvotes: 9
