Reputation: 17234
I'm using Pool from multiprocessing to do some stuff.
def my_func(...):
    # Different processes can take different time
    ...
    print a, b, c  # values that I calculated above (includes 2 newlines)
There have been instances where the values of a, b, c for one process are not printed together. Is there a way to avoid that? Can anyone explain what's happening here and how it can be avoided?

My understanding is that if I remove all newlines from the print and keep only one at the end, it should fix the problem. (The problem is, it's not reproducible every time, so I'm still testing.)
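That intuition is on the right track. As a sketch (the a, b, c values and the body of my_func here are stand-ins for whatever the real function computes): build the whole message as one string and emit it with a single write call, which is far less likely to interleave than several separate prints, though POSIX only guarantees atomicity for writes up to the pipe buffer size.

```python
import sys
from multiprocessing import Pool

def my_func(x):
    # Hypothetical placeholders for the a, b, c computed in the real function.
    a, b, c = x, x * 2, x * 3
    # Build the whole message first and emit it with a single write;
    # one write() is far less likely to interleave than several prints.
    msg = "a=%s\nb=%s\nc=%s\n" % (a, b, c)
    sys.stdout.write(msg)
    return msg

if __name__ == '__main__':
    with Pool(4) as pool:
        pool.map(my_func, range(4))
```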
Is there a way I can acquire sys.stdout exclusively for a process while I'm printing to STDOUT, and then release it?
Upvotes: 3
Views: 1477
Reputation: 178115
You could use a multiprocessing.Lock to serialize the prints. Create the common lock in the main program and pass the same lock to all the child processes. Example:
#!python3
from multiprocessing import Process, Lock
import time
import sys

def test(n, lock):
    with lock:
        for i in range(20):
            print(n, end='')
            sys.stdout.flush()
            time.sleep(.01)  # needed some delay, or it ran too fast and didn't mix the output
        print()

if __name__ == '__main__':
    lock = Lock()
    jobs = [Process(target=test, args=(n, lock)) for n in range(5)]
    for job in jobs:
        job.start()
    for job in jobs:
        job.join()
With the with lock: block commented out:
003023120134201342013420314203140231402134203140231420134021342031402134201342013420314203142
1342
14
4
As written:
00000000000000000000
11111111111111111111
44444444444444444444
22222222222222222222
33333333333333333333
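Since the question uses a Pool rather than bare Process objects, the same lock can be handed to the pool workers through the initializer/initargs parameters (a sketch; init_worker and work are illustrative names here) — a Lock can't be passed as a task argument through pool.map itself:

```python
from multiprocessing import Pool, Lock

def init_worker(l):
    # Runs once in each worker process; stash the shared lock in a
    # module-level global so the task function can reach it.
    global lock
    lock = l

def work(n):
    with lock:  # serialize the whole multi-line print block
        print('line 1 for', n)
        print('line 2 for', n)
    return n

if __name__ == '__main__':
    l = Lock()
    with Pool(4, initializer=init_worker, initargs=(l,)) as pool:
        results = pool.map(work, range(8))
    print(results)
```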
Upvotes: 5
Reputation: 3205
Maybe setting this variable before running your program will help you; it forces Python to flush stdout after each write:
export PYTHONUNBUFFERED=1
See also: Disable output buffering
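For a one-off run you can also set the variable inline, or use the interpreter's -u switch, which has the same effect (shown here with -c snippets standing in for your script):

```shell
# Set the variable only for this invocation:
PYTHONUNBUFFERED=1 python3 -c "print('unbuffered')"

# The -u flag is the command-line equivalent:
python3 -u -c "print('unbuffered')"
```

Note that unbuffered output only makes each write reach the terminal promptly; it does not make a multi-line print atomic across processes, so it reduces rather than eliminates interleaving.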
Upvotes: 0