gurshi

Reputation: 61

Sharing modules across python processes

We are writing a system where memory is tight. There are multiple Python processes that will import the same set of classes. If each process loads a gazillion classes and consumes a couple hundred megabytes, then with 10 processes I am running into gigabytes. Is there a way to "share" class imports across Python processes?

Summary:

import classA # Process 1 : loads classA in memory
import classA # Process 2 : reuses what was loaded previously by Process 1

PS: What I'm trying to achieve is something that you would do with .so modules in C/C++.

Upvotes: 6

Views: 1233

Answers (2)

nneonneo

Reputation: 179687

It's possible to save at least a bit of memory if your OS supports an efficient copy-on-write fork (many do). Just import everything once in the parent, then fork() to create all the children.

Note that you won't save anything if you have many small objects, as the objects' refcounts will need to be written to. However, you could see substantial savings with large static structures like attribute dictionaries.
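Below is a minimal sketch of that approach, assuming a POSIX system where os.fork() is available; big_module and its BIG_TABLE attribute are hypothetical stand-ins for your heavy imports:

import os

# Import the heavy modules ONCE in the parent, before forking.
import big_module  # hypothetical stand-in for your large imports

def worker(worker_id):
    # The child reuses the module objects already loaded by the parent;
    # memory pages stay shared until the child writes to them (copy-on-write).
    print(worker_id, len(big_module.BIG_TABLE))

children = []
for i in range(10):
    pid = os.fork()
    if pid == 0:           # child process
        worker(i)
        os._exit(0)
    children.append(pid)   # parent records the child's PID

for pid in children:
    os.waitpid(pid, 0)

On Unix you can get the same effect with the multiprocessing module's "fork" start method, since forked workers inherit everything the parent has already imported.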

Upvotes: 2

CrazyCasta

Reputation: 28362

I don't believe this is possible in the design of Python. You might want to use threads instead of separate processes if this is a big deal. Also, if you're using 100 MB just by importing several modules, you might be doing something wrong in the modules (that seems like quite a bit of memory).
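Here is a rough sketch of the thread-based alternative; big_module is again a hypothetical stand-in for the heavy imports, and every thread shares the single loaded copy because they all live in one process:

import threading

import big_module  # loaded once; all threads in this process share it

def worker(worker_id):
    # Threads read the same module objects; there is only one copy
    # of the imported code and data in memory.
    print(worker_id, len(big_module.BIG_TABLE))

threads = [threading.Thread(target=worker, args=(i,)) for i in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()

Keep in mind that CPython's GIL prevents threads from running Python bytecode in parallel, so this trades CPU parallelism for the memory savings.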

If you absolutely must import everything, can't cut down on memory usage, and can't use threads, one possibility would be to move the memory-heavy code into Python extensions written in C/C++. The read-only segments of the shared libraries those extensions live in are then shared across processes, which might also reduce the memory footprint somewhat.

P.S. In Python you're actually importing modules, not classes, similar to how in C/C++ you're including header files, not classes.

Upvotes: 1
