Phillip B Oldham

Reputation: 19385

Fastest way to write to a txt file in Python

I have an odd situation: I have inherited a project which has a number of Python processes running concurrently doing various things. These processes are spun-up independently of each other; multiprocessing/threading isn't involved.

I'd like to add some functionality where they write to a single text file with a one-line update when certain events occur, which I can then parse much later on (probably on a separate server) to gather statistics.

I'm looking for a way to append this line to the file that won't cause problems if one of the other processes is trying to do the same thing at the same time. I don't want to add any other software to the stack if possible.

Suggestions?

Upvotes: 2

Views: 706

Answers (4)

ndpu

Reputation: 22561

Use a queue: http://docs.python.org/2/library/multiprocessing.html#multiprocessing.Queue. All your processes can put data onto the queue, and one of them gets the data and writes it to the file.

Upvotes: 1

chthonicdaemon

Reputation: 19770

The logging module is thread-safe by design; it's in the standard library and can write to files.
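For example (the logger name and file name below are placeholders). One caveat for this question: logging's locks only serialize threads within a single process; independently started processes each open their own handle, though short one-line appends to a file opened in append mode are usually safe in practice on POSIX:

```python
import logging

logger = logging.getLogger("events")
handler = logging.FileHandler("events-log.log")  # opens in append mode by default
handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("cache rebuilt")  # one line per event, timestamped for later parsing
```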

Upvotes: 0

Alex W

Reputation: 38193

Sounds like you may want to try using a file lock:

from lockfile import FileLock, LockTimeout

lock = FileLock("/some/file/or/other")
while not lock.i_am_locking():
    try:
        lock.acquire(timeout=60)    # wait up to 60 seconds
    except LockTimeout:
        lock.break_lock()
        lock.acquire()
print "I locked", lock.path
# ... append your update line to the file here, while holding the lock ...
lock.release()

http://pythonhosted.org/lockfile/lockfile.html#filelock-objects
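Since the question prefers not to add anything to the stack, the same idea is available in the standard library on POSIX via fcntl.flock. This helper is a sketch of that alternative, not part of the lockfile example above:

```python
import fcntl

def append_line(path, line):
    # Open in append mode and take an exclusive advisory lock;
    # other processes calling this helper block until the lock is released.
    with open(path, "a") as f:
        fcntl.flock(f, fcntl.LOCK_EX)
        try:
            f.write(line + "\n")
            f.flush()
        finally:
            fcntl.flock(f, fcntl.LOCK_UN)
```

Note that flock is advisory: it only excludes other processes that also take the lock before writing.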

Upvotes: 4

Steve Barnes

Reputation: 28370

Have a single file-writing process and have the others queue messages for it to write. This could be done via event raising, message queues, Pyro, publish/subscribe: your choice!

Upvotes: 0
