Reputation: 171
I'm writing a Python 3 program that will start another Python program to process files. I need the main program to keep an arbitrary number of processes running at any time. I'll read that number from a file each time one of the processes finishes, to see if another process should be spawned.
What is the best way to spawn the processes? I need to know when a process finishes so I can spawn another, but that is all. I would really like the spawned program to run as a completely separate process; the processes are long-running and use a lot of resources. I don't need the output or exit status of the program, just to know that it exited.
I've considered using mutex files created by the processes, but I've had dodgy results with those.
Any suggestions?
Upvotes: 0
Views: 92
Reputation: 3520
In Python, you can spawn a new process with os.system(), the os.exec* family, subprocess, or multiprocessing. Choose one based on your requirements.
The parent process can use the SIGCHLD signal, or os.waitpid() / the os.wait* functions, to learn when a child process has terminated. See the source code of subprocess.call() for the details.
On Linux, SIGCHLD is the signal delivered to a process when a child process it started has terminated.
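As a sketch of the os.wait approach (POSIX-only; os.wait() is not available on Windows), the parent can start children with subprocess.Popen and block until any one of them exits. The sleep commands are throwaway placeholders for the real worker program:

```python
import os
import subprocess
import sys

# Start two throwaway children (placeholders for the real worker program).
procs = [
    subprocess.Popen([sys.executable, '-c', 'import time; time.sleep(0.2)']),
    subprocess.Popen([sys.executable, '-c', 'import time; time.sleep(0.1)']),
]
pids = {p.pid for p in procs}

while pids:
    # os.wait() blocks until *any* child terminates, then returns its
    # pid and exit status -- this is the point where you could check the
    # limit file and spawn a replacement.
    pid, status = os.wait()
    pids.discard(pid)
    print('child %d exited' % pid)

print('all children done')
```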
Finally, here is a simple example of using subprocess (Python 3):
import subprocess

# call() blocks until the child exits and returns its exit status.
# Note: don't combine a list of arguments with shell=True -- on POSIX
# only the first element would be run as the command.
subprocess.call(['ls', '-1'])
print('main exit')
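Since call() blocks, it won't keep several children running at once. A supervisor loop fitting the question could instead hold a list of Popen objects and use the non-blocking poll() to replace any that have exited. This is a minimal sketch; read_limit() and make_worker() are placeholders for re-reading the limit file and launching the real worker:

```python
import subprocess
import sys
import time

def read_limit():
    # Placeholder: the real program would re-read the limit file here.
    return 2

def make_worker():
    # Placeholder worker: a short-lived python process.
    return subprocess.Popen([sys.executable, '-c', 'import time; time.sleep(0.1)'])

procs = []
deadline = time.time() + 1      # run the demo loop for about a second
while time.time() < deadline:
    # poll() is non-blocking: it returns the exit status once the child
    # has terminated, or None while it is still running.
    procs = [p for p in procs if p.poll() is None]
    while len(procs) < read_limit():
        procs.append(make_worker())
    time.sleep(0.05)

for p in procs:
    p.wait()
print('supervisor exit')
```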
Upvotes: 1
Reputation: 324
I would recommend looking into the multiprocessing library.
https://docs.python.org/3/library/multiprocessing.html
It spawns separate Python processes which you can keep track of and communicate with using queues.
#!/usr/bin/env python3
import multiprocessing

class MyFancyClass(object):

    def __init__(self, name):
        self.name = name

    def do_something(self):
        proc_name = multiprocessing.current_process().name
        print('Doing something fancy in %s for %s!' % (proc_name, self.name))

def worker(q):
    obj = q.get()
    obj.do_something()

if __name__ == '__main__':
    queue = multiprocessing.Queue()
    p = multiprocessing.Process(target=worker, args=(queue,))
    p.start()

    queue.put(MyFancyClass('Fancy JimF'))

    # Wait for the worker to finish
    queue.close()
    queue.join_thread()
    p.join()
The example is from this article: https://pymotw.com/2/multiprocessing/communication.html, updated slightly to run under Python 3 (it's very similar in this instance).
Upvotes: 1