Raj Raj

Reputation: 1

python single file multiple lock issue

I have a scenario in which two processes (Log_writer1.py and Log_writer2.py) run as cron jobs and eventually write to the same log file (test_log_file.txt) through the log_event function. Because each process creates its own lock, there are inconsistencies and not all data ends up in the log file. Is there any way a single lock can be shared between multiple processes to avoid these inconsistencies? The code snippets are below. Kindly suggest.

Script : test_cifs_log_writer.py
=================================================================================================

import time


def log_event(level, msg, job_log_file, lck):
    # NOTE: this lock only serialises threads within one process.
    lck.acquire()
    for i in range(50):
        with open(job_log_file, 'a') as wr_log:
            print('Now printing message : ' + str(msg))
            wr_log.write(time.ctime() + ' - ' + level.upper() + ' - ' + str(msg) + '\n')
    lck.release()


Script : Log_writer1.py
=================================================================================================

from threading import Thread, Lock
from test_cifs_log_writer import *

lck = Lock()
t1 = Thread(target=log_event, args=('info', 'Thread 1 : msg', 'test_log_file.txt', lck))
t2 = Thread(target=log_event, args=('info', 'Thread 2 : msg', 'test_log_file.txt', lck))

lst = [t1, t2]

for thr in lst:
    thr.start()

for thr in lst:
    thr.join()

Script : Log_writer2.py
=================================================================================================

from threading import Thread, Lock
from test_cifs_log_writer import *

lck = Lock()
t1 = Thread(target=log_event, args=('info', 'Thread 3 : msg', 'test_log_file.txt', lck))
t2 = Thread(target=log_event, args=('info', 'Thread 4 : msg', 'test_log_file.txt', lck))

lst = [t1, t2]

for thr in lst:
    thr.start()

for thr in lst:
    thr.join()

Upvotes: 0

Views: 40

Answers (1)

AKX

Reputation: 168957

No, there is no easy way. Even if you could share a lock, you'd then run into lock-contention issues.

Either:

  • (easiest, just requires an extra step later) have each process write to a separate, uniquely named log file and concatenate them afterwards if you need to.
  • (harder, requires an extra process and communication) delegate writing the single log file to one process that the other processes communicate with. (This is essentially what SyslogHandler does.)
  • (worst, slow and lock-contended) have each process lock, open, write, close, and unlock the file for each entry it wants to write.
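The first option can be sketched as follows. The merge_logs helper and the PID-based file naming are illustrative assumptions, not part of the original scripts:

```python
import glob
import os
import time


def log_event(level, msg, job_log_file):
    # No lock is needed: each process appends only to its own file.
    with open(job_log_file, 'a') as wr_log:
        for _ in range(50):
            wr_log.write(time.ctime() + ' - ' + level.upper() + ' - ' + str(msg) + '\n')


def merge_logs(pattern, merged_file):
    # Run once after all the cron jobs have finished, to produce a single log.
    with open(merged_file, 'w') as out:
        for part in sorted(glob.glob(pattern)):
            with open(part) as src:
                out.write(src.read())


# Each writer derives a unique file name from its own PID:
log_event('info', 'Writer msg', 'test_log_file.%d.txt' % os.getpid())
merge_logs('test_log_file.*.txt', 'merged_log.txt')
```

If the interleaved order of entries across processes matters, write sortable timestamps (e.g. time.time()) instead of time.ctime(), and sort the merged lines afterwards.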

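The third option, a lock that really is shared between processes, can be approximated with fcntl.flock, which takes an OS-level advisory lock on the log file itself. This is a sketch, assuming the cron jobs run on a Unix-like system (the fcntl module is Unix-only):

```python
import fcntl
import time


def log_event(level, msg, job_log_file):
    # flock() is held at the OS level, so it excludes other processes,
    # not just other threads: lock, write, flush, unlock per entry.
    for _ in range(50):
        with open(job_log_file, 'a') as wr_log:
            fcntl.flock(wr_log, fcntl.LOCK_EX)  # blocks until the lock is free
            wr_log.write(time.ctime() + ' - ' + level.upper() + ' - ' + str(msg) + '\n')
            wr_log.flush()
            fcntl.flock(wr_log, fcntl.LOCK_UN)


log_event('info', 'Writer 1 : msg', 'test_log_file.txt')
```

As the answer notes, this is the slowest option: every entry pays for an open, a lock round-trip, and a close, and all writers contend for the same lock.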
Upvotes: 1
