Reputation: 3950
I have a couple of Python/NumPy programs that tend to freeze the PC or make it run very slowly when they use too much memory. Once memory usage gets too high (e.g. 3.8 of 4 GB), I can't even stop the scripts or move the cursor anymore. Therefore, I would like the program to quit automatically when it hits a critical memory limit, e.g. 3 GB.
I could not find a solution yet. Is there a Pythonic way to deal with this? I run my scripts on both Windows and Linux machines.
Upvotes: 9
Views: 3437
Reputation: 6861
You could limit the process's memory usage, but that is OS specific.
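For example, on Linux and other Unix-like systems you could use the standard-library resource module. A minimal sketch, assuming a 3 GB cap (the resource module is not available on Windows, and the limit value here is just an example):

    import resource

    # Linux/Unix only -- the resource module does not exist on Windows.
    # Cap this process's total address space (RLIMIT_AS) at 3 GB.
    limit_bytes = 3 * 1024 ** 3

    soft, hard = resource.getrlimit(resource.RLIMIT_AS)
    resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, hard))

    # Once the cap is reached, further allocations raise MemoryError,
    # which you can catch to shut down cleanly instead of freezing the machine:
    try:
        data = bytearray(limit_bytes)  # deliberately oversized allocation
    except MemoryError:
        print("Hit the memory cap, shutting down cleanly.")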
Another solution would be checking the value of psutil.virtual_memory() and exiting your program once it crosses some threshold.
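A minimal sketch of that approach, assuming psutil is installed; the 90 % threshold is an example value, not something from your question:

    import sys
    import psutil

    MEMORY_LIMIT_PERCENT = 90  # example threshold, pick what suits your machine

    def check_memory():
        """Exit the program if overall memory usage crosses the threshold."""
        used = psutil.virtual_memory().percent
        if used > MEMORY_LIMIT_PERCENT:
            sys.exit("Memory usage at %.1f%%, aborting before the machine freezes." % used)

    # Sprinkle check_memory() calls into your main loop, e.g.:
    # for chunk in work_items:
    #     check_memory()
    #     process(chunk)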
Though OS-independent, the second solution is not Pythonic at all. Memory management is one of the things we have operating systems for.
Upvotes: 5
Reputation: 980
I'd agree that in general you want to do this from the operating system, if only because there's a reliability problem in having "possibly runaway code check itself for possibly runaway behavior".
If a hard and fast requirement is to do this WITHIN the script, then I think we'd need to know more about what you're actually doing. If a single large data structure is consuming the majority of the memory, you can use sys.getsizeof to measure how large it is and raise/catch an exception if it grows larger than you want; see the sketch below.
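A minimal sketch of that idea (the 3 GB limit and the bytearray stand-in are hypothetical example values):

    import sys

    SIZE_LIMIT = 3 * 1024 ** 3  # 3 GB; example value

    def check_size(obj):
        # Caveat: sys.getsizeof on a container counts only the container itself,
        # not the objects it holds; for a NumPy array, arr.nbytes gives the
        # size of the underlying data buffer.
        if sys.getsizeof(obj) > SIZE_LIMIT:
            raise MemoryError("data structure exceeded %d bytes" % SIZE_LIMIT)

    try:
        big = bytearray(100 * 1024 ** 2)  # stand-in for your large structure
        check_size(big)
    except MemoryError as err:
        print("Aborting:", err)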
But without knowing at least a little more about the program structure, I think it'll be hard to help...
Upvotes: 3