Rich Lucking

Reputation: 21

Automatically restarting Python sub-processes using identical arguments

I have a Python script which calls a series of sub-processes. They need to run "forever", but they occasionally die or get killed. When this happens I need to restart the process with the same arguments as the one that died.

This is a simplified version [edit: this is now the less simplified version, which includes the "restart" code]:

import multiprocessing
import time
import random

def printNumber(number):
    print("starting :", number)
    while random.randint(0, 5) > 0:
        print(number)
        time.sleep(2)

if __name__ == '__main__':

    children = [] # list
    args = {} # dictionary
    for processNumber in range(10,15):
        p = multiprocessing.Process(
                target=printNumber,
                args=(processNumber,)
                )
        children.append(p)
        p.start()
        args[p.pid] = processNumber

    while True:
        time.sleep(1)
        for n, p in enumerate(children):
            if not p.is_alive():
                #get parameters dead child was started with
                pidArgs = args[p.pid]
                del(args[p.pid])
                print("n,args,p: ",n,pidArgs,p)
                children.pop(n)

                # start new process with same args
                p = multiprocessing.Process(
                    target=printNumber,
                    args=(pidArgs,)
                )
                children.append(p)
                p.start()
                args[p.pid] = pidArgs

I have updated the example to illustrate how I want the processes to be restarted if one crashes, is killed, etc., keeping track of which pid was started with which args.

Is this the "best" way to do this, or is there a more "Pythonic" way of doing it?

Upvotes: 0

Views: 74

Answers (3)

Paul Cornelius

Reputation: 10946

I think I would create a separate thread for each Process and use a ProcessPoolExecutor. Executors have a useful method, submit, which returns a Future. You can wait on each Future and submit to a fresh Executor when the Future is done. Arguments to the function are stored as instance attributes, so restarting is just a simple loop.

import threading
from concurrent.futures import ProcessPoolExecutor
import time
import random
import traceback

def printNumber(number):
    print("starting :", number)
    while random.randint(0, 5) > 0:
        print(number)
        time.sleep(2)
        
class KeepRunning(threading.Thread):
    def __init__(self, func, *args, **kwds):
        self.func = func
        self.args = args
        self.kwds = kwds
        super().__init__()
        
    def run(self):
        while True:
            with ProcessPoolExecutor(max_workers=1) as pool:
                future = pool.submit(self.func, *self.args, **self.kwds)
                try:
                    future.result()
                except Exception:
                    traceback.print_exc()

if __name__ == '__main__':
    for process_number in range(10, 15):
        keep = KeepRunning(printNumber, process_number)
        keep.start()
    while True:
        time.sleep(1)
    

At the end of the program is a loop to keep the main thread running. Without that, the program will attempt to exit while your Processes are still running.

Upvotes: 1

Barak Fatal

Reputation: 178

Instead of just starting the processes and forgetting about them, you can save each process together with its arguments in a list, and then keep looping over that list to check that they are still alive.

For example:

import multiprocessing
import time

# printNumber is the worker function from the question

if __name__ == '__main__':
    process_list = []

    for processNumber in range(5):
        process = multiprocessing.Process(
            target=printNumber,
            args=(processNumber,)
            )
        process_list.append((process, processNumber))
        process.start()

    while True:
        time.sleep(1)
        for running_process, process_args in list(process_list):
            if not running_process.is_alive():
                # Start a new process with the same argument
                new_process = multiprocessing.Process(
                    target=printNumber,
                    args=(process_args,)
                )
                new_process.start()
                # Replace the terminated process in the list
                process_list.remove((running_process, process_args))
                process_list.append((new_process, process_args))

I must say that I'm not sure Python is the best place to do this; you may want to look at scheduler services like Jenkins or something similar.

Upvotes: 0

ShadowCrafter_01

Reputation: 604

For the example you provided, I would just remove the exit condition from the while loop and change it to while True.
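
Applied to the printNumber function from the question, that change would look like this:

def printNumber(number):
    print("starting :", number)
    while True:  # no exit condition, run forever
        print(number)
        time.sleep(2)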

As you said, though, the actual code is more complicated (why didn't you post that?). So if the process gets terminated by, let's say, an exception, just put the code inside a try/except block. You can then put said block in an infinite loop.
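
A minimal sketch of that idea, using the question's printNumber as a stand-in for the more complicated real code:

import time
import traceback

def printNumber(number):
    print("starting :", number)
    while True:
        try:
            # the real, more complicated work would go here
            print(number)
            time.sleep(2)
        except Exception:
            # log the error and keep looping instead of letting the process die
            traceback.print_exc()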

I hope this is what you are looking for; given the goal and the information you provided, that seems to be the right way to do it.

Upvotes: 0
