El'endia Starman

Reputation: 2244

How do I automatically kill a process that uses too much memory with Python?

The situation: I have a website that allows people to execute arbitrary code in a different language (specifically, an esolang I created), using a Python interpreter on a shared-hosting server. I run this code in a separate process which is given a time limit of 60 seconds.

The problem: users can run something like (Python equivalent) 10**(10**10), which rapidly consumes far more memory than my hosting plan allots. It also apparently locks up Apache (or makes it too slow to respond), so I have to restart it.

I have seen this question, but the given answer uses Perl, which I do not know at all, so I'd like an answer in Python. The OS is Linux, for what it's worth.

Specifically, I want the following characteristics:

  1. Runs automatically
  2. Force-kills any process that exceeds some memory limit, like 1 MB or 100 MB
  3. Kills any process spawned by my code that is more than 24 hours old (a sketch of what I mean follows the code below)

I use this piece of code (in a Django view) to create the process and run it (proxy_prgm is a Manager proxy, so I can retrieve data from the program that's interpreting the esolang code):

prgmT[uid] = multiprocessing.Process(
    target=proxy_prgm.runCatch,
    args=(steps,),
    name="program run")

prgmT[uid].start()
prgmT[uid].join(60)  # time limit of 1 minute

if prgmT[uid].is_alive():
    prgmT[uid].terminate()  # still running after the timeout, so force-kill it
    proxy_prgm.stop()
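
For point 3, what I have in mind is something like the following (an untested sketch on my part; it assumes the psutil library is installed and that the interpreter processes are children of the web worker):

import time
import psutil

MAX_AGE = 24 * 60 * 60  # 24 hours, in seconds

def reap_old_children():
    parent = psutil.Process()  # the current (web worker) process
    for child in parent.children(recursive=True):
        # create_time() is the child's start time as a Unix timestamp
        if time.time() - child.create_time() > MAX_AGE:
            child.kill()  # SIGKILL on Linux; the process can't ignore it

Something like this could run from a cron job or a background thread, though it only covers processes the worker itself spawned.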

If you need more details, don't hesitate to tell me what to edit in (or ask me questions).

Upvotes: 11

Views: 3829

Answers (1)

John O'Brien

Reputation: 131

Another approach that might work: resource.setrlimit() (more details in this other StackOverflow answer). It seems that this lets you set a memory limit on a process and its subprocesses; you'll have to figure out how to handle the case where the limit is hit, though. I don't have personal experience using it, but hopefully it would stop Apache from locking up on you.
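
An untested sketch of what that could look like (the 100 MB cap and the run_with_limit wrapper are my own invention, not part of your code):

import resource

MEM_LIMIT = 100 * 1024 * 1024  # 100 MB, in bytes

def run_with_limit(proxy_prgm, steps):
    # Cap this process's total address space. Once exceeded, further
    # allocations raise MemoryError instead of eating the whole server.
    resource.setrlimit(resource.RLIMIT_AS, (MEM_LIMIT, MEM_LIMIT))
    try:
        proxy_prgm.runCatch(steps)
    except MemoryError:
        proxy_prgm.stop()  # or whatever cleanup your API needs

Then create the process with target=run_with_limit and args=(proxy_prgm, steps). Since the limit is set inside the child process, the parent Apache/Django worker is unaffected.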

Upvotes: 4
