Reputation: 2120
I have a python program with multiple threads. Each thread detects events, which I would like to store somewhere so that I can read them in again (for testing). Right now, I'm using Pickle to output the events, and each thread outputs to a different file. Ideally, I would only use one output file, and all the threads would write to it, but when I try this, it looks like the various threads try to write their output at the same time, and they don't get pickled properly. Is there a way to do this?
Upvotes: 5
Views: 7634
Reputation: 14075
Here is an example using threading.Lock() (the original had a typo in the lock name and assigned the result of Thread.start(), which always returns None):

import threading
import pickle

pickle_lock = threading.Lock()

def do(s):
    # Hold the lock while pickling so threads cannot interleave output.
    pickle_lock.acquire()
    try:
        ps = pickle.dumps(s)
    finally:
        pickle_lock.release()
    return ps

t1 = threading.Thread(target=do, args=("foo",))
t2 = threading.Thread(target=do, args=("bar",))
t1.start()
t2.start()
t1.join()
t2.join()
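The example above only pickles to strings; for the original use case the lock needs to guard the writes to the one shared file. A minimal sketch of that (the file name and event shape here are made up for illustration):

```python
import pickle
import threading

write_lock = threading.Lock()

def record_event(f, event):
    # Only one thread may pickle into the shared file at a time.
    with write_lock:
        pickle.dump(event, f)

with open("events.pkl", "wb") as f:
    threads = [
        threading.Thread(target=record_event, args=(f, {"thread": i}))
        for i in range(4)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

# Read the events back: call pickle.load() repeatedly until EOF.
events = []
with open("events.pkl", "rb") as f:
    while True:
        try:
            events.append(pickle.load(f))
        except EOFError:
            break
print(len(events))  # 4
```

Because each pickle.dump() appends a complete, self-delimiting record, repeated pickle.load() calls on the same file recover the events one by one.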
Upvotes: 1
Reputation: 60604
Seems like a good place to use a Queue.

From the Queue docs:
"The Queue module implements multi-producer, multi-consumer queues. It is especially useful in threaded programming when information must be exchanged safely between multiple threads. The Queue class in this module implements all the required locking semantics. It depends on the availability of thread support in Python; see the threading module."
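A sketch of this approach, assuming a single dedicated writer thread and a sentinel value to signal shutdown (the file name, event shape, and SENTINEL are illustrative; the module is named Queue in Python 2 and queue in Python 3):

```python
import pickle
import queue
import threading

SENTINEL = None  # assumed marker telling the writer to stop
q = queue.Queue()

def worker(i):
    # Producers never touch the file; they just enqueue events.
    q.put({"thread": i, "event": "detected"})

def writer(path):
    # A single consumer pickles everything, so no file lock is needed.
    with open(path, "wb") as f:
        while True:
            event = q.get()
            if event is SENTINEL:
                break
            pickle.dump(event, f)

w = threading.Thread(target=writer, args=("events.pkl",))
w.start()
workers = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in workers:
    t.start()
for t in workers:
    t.join()
q.put(SENTINEL)  # all producers done; tell the writer to finish
w.join()
```

The queue does the locking internally, so the event-detecting threads stay free of synchronization code.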
Upvotes: 4
Reputation: 879481
The logging module has an RLock built into its Handlers, so you could use logging as normal (just create a handler that logs to a file).
Upvotes: 1
Reputation: 500337
You could create a lock and acquire/release it around every call to pickle.dump().
Upvotes: 1
Reputation: 80761
Yes, with threading.Lock() objects. Create the lock before creating your threads, pass it to the method responsible for saving/pickling items, and have that method acquire the lock before writing to the file and release it afterward.
Upvotes: 2