Reputation: 1533
I have a Django-based server and I'm calling a script that does a bunch of work. This needs to be asynchronous, so I'm using Popen. However, for debugging I want to redirect stdout and stderr from PIPE to a file. Will this affect the asynchronous performance?
How do I make sure the file is opened and closed properly when the Python script itself has already finished (and returned)?
Upvotes: 0
Views: 674
Reputation: 19382
Popen runs asynchronously by default. The Popen object can be stored and queried later.
import subprocess

p = subprocess.Popen(executable)
# continue with other work
p.poll()       # returns None if p is still running, the returncode otherwise
p.terminate()  # asks the process to stop (sends SIGTERM on POSIX)
p.wait()       # blocks until the process finishes - makes it synchronous
All that is needed is to store p somewhere until its status has to be checked, e.g. in a global list of started processes, if there is no better place.
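For example, a minimal sketch of such a registry (the running list and the helper names here are illustrative, not a fixed API):

import subprocess

running = []  # module-level registry of started processes

def start(executable):
    p = subprocess.Popen(executable)
    running.append(p)
    return p

def reap_finished():
    # poll() is non-blocking, so this is safe to call from a status view.
    for p in running[:]:
        if p.poll() is not None:
            print(p.args, 'exited with code', p.returncode)
            running.remove(p)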
UPDATE
The problem seems to be how to open a file, give it to Popen, and then close the file when the process is finished, without keeping any references. This can be done simply with a thread:
import subprocess
from threading import Thread

def run_process(executable, filename):
    with open(filename, 'w') as f:
        subprocess.Popen(executable, stdout=f).wait()

Thread(target=run_process, args=(executable, filename)).start()
Run it and forget about it. run_process will block until the process finishes and then close the file, but that all happens in a different thread.
Note: you may or may not care about what happens to the thread or to the process if they are not finished when the Django process exits.
To handle that, you could create a more complex setup in which you remember the started threads and stop them before Django exits; that handler might also terminate the process or wait for it to finish.
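One possible shape for that cleanup, sketched with an atexit hook (the procs registry, 'myprog', and 'out.log' are illustrative placeholders; the thread is made a daemon so a blocked wait() cannot keep the interpreter from exiting):

import atexit
import subprocess
from threading import Thread

procs = []  # children started by worker threads

def run_process(executable, filename):
    with open(filename, 'w') as f:
        p = subprocess.Popen(executable, stdout=f)
        procs.append(p)
        p.wait()  # the thread blocks here, then the file is closed

def stop_children():
    # Ask any child that is still running to terminate; the OS closes
    # its end of the file when the process goes away.
    for p in procs:
        if p.poll() is None:
            p.terminate()

atexit.register(stop_children)

Thread(target=run_process, args=(['myprog'], 'out.log'), daemon=True).start()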
Upvotes: 1
Reputation: 77407
It depends on how asynchronous your program needs to be. Writes to stdout go to the operating system's file cache, which is normally fast, but they may stall from time to time as data is flushed to disk. Normally that's not a problem.
Since stdout is not a console, the child will do block-buffered writes, and that can be a problem if the program dies in a way that skips flushing the buffer. The buffering can also delay what appears in the file, so, for instance, an external program cat'ing the file may not see the latest logs right away.
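If that delay matters, the buffering can often be reduced from outside the child. For a Python child, for example, the -u flag makes it flush immediately (worker.py here is a hypothetical script):

import subprocess

with open('stdout', 'w') as out:
    # -u disables Python's block buffering, so lines appear in the
    # file as soon as the child prints them.
    proc = subprocess.Popen(['python', '-u', 'worker.py'], stdout=out)

On Linux, prefixing an arbitrary command with stdbuf -oL has a similar line-buffering effect for many programs.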
The parent process should close its copy of the file after starting the child; the operating system will take care of closing the child's copy when the process terminates.
import subprocess

with open('stdout', 'w') as out, open('stderr', 'w') as err:
    proc = subprocess.Popen(['myprog'], stdout=out, stderr=err)
Upvotes: 1