Marcvs

Reputation: 125

Python Concurrent logging read

I've been using ConcurrentLogHandler for multi-platform, multi-process file logging.

Now I'd like to be sure I'm reading 'atomic' parts of the logs; that is, I do not want to read half a log line, for example. The concurrent file logger actually takes a LOCK_EX (exclusive) lock on the file. Does anyone have experience using LOCK_SH (shared) locks when reading ConcurrentLogHandler files? I cannot see such a read operation among the module's services.

Or do you have experience with this kind of multiple-reader/single-writer setup using another Python module (without rewriting everything by hand)?

Upvotes: 0

Views: 815

Answers (1)

dano

Reputation: 94891

ConcurrentLogHandler is just using the file locking tools the OS provides (fcntl.flock on POSIX, win32file.LockFileEx on Windows), so there shouldn't be any issues if you take a LOCK_SH on the file; ConcurrentLogHandler will respect the lock when it tries to take its LOCK_EX. The easiest way to do it would be to use the portalocker module that's included with ConcurrentLogHandler:

import portalocker

with open("logfile.txt") as f:
    portalocker.lock(f, portalocker.LOCK_SH)
    for line in f:
        pass  # do stuff with each line
# The file is unlocked when it's closed, at the end of the with block.
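If you'd rather not add the portalocker dependency, the same shared-lock read can be sketched with fcntl directly (POSIX only; on Windows you'd use win32file.LockFileEx instead). The function name here is just for illustration:

```python
import fcntl

def read_log_lines(path):
    """Read a log file under a shared (LOCK_SH) lock.

    LOCK_SH allows any number of concurrent readers; a writer's
    LOCK_EX will block until all shared locks are released, so a
    reader never sees a partially written line.
    """
    with open(path) as f:
        fcntl.flock(f.fileno(), fcntl.LOCK_SH)
        try:
            return f.readlines()
        finally:
            # Explicit unlock; closing the file would also release it.
            fcntl.flock(f.fileno(), fcntl.LOCK_UN)
```

The try/finally guarantees the lock is released even if reading raises, so a crashed reader can't stall the writer.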

Upvotes: 1

Related Questions