I have a scenario in which two processes (Log_writer1.py and Log_writer2.py) run as cron jobs and both write to the same log file (test_log_file.txt) through the log_event function. Because each process creates its own lock, the writes interleave inconsistently and not all of the data ends up in the log file. Is there any way a single lock can be shared between multiple processes to avoid these inconsistencies? The code snippets are below. Kindly suggest.
Script : test_cifs_log_writer.py
=================================================================================================
import time

def log_event(level, msg, job_log_file, lck):
    # Hold the lock while appending all 50 lines so this call's
    # messages are not interleaved with the other thread's.
    with lck:
        for i in range(50):
            with open(job_log_file, 'a') as wr_log:
                print('Now printing message : ' + str(msg))
                wr_log.write(str(time.ctime()) + ' - ' + level.upper() + ' - ' + str(msg) + '\n')
Script : Log_writer1.py
=================================================================================================
from threading import Thread, Lock
from test_cifs_log_writer import *

lck = Lock()
t1 = Thread(target=log_event, args=('info', 'Thread 1 : msg', 'test_log_file.txt', lck))
t2 = Thread(target=log_event, args=('info', 'Thread 2 : msg', 'test_log_file.txt', lck))
lst = [t1, t2]
for thr in lst:
    thr.start()
for thr in lst:
    thr.join()
Script : Log_writer2.py
=================================================================================================
from threading import Thread, Lock
from test_cifs_log_writer import *

lck = Lock()
t1 = Thread(target=log_event, args=('info', 'Thread 3 : msg', 'test_log_file.txt', lck))
t2 = Thread(target=log_event, args=('info', 'Thread 4 : msg', 'test_log_file.txt', lck))
lst = [t1, t2]
for thr in lst:
    thr.start()
for thr in lst:
    thr.join()
Answer
=================================================================================================
No, not in an easy way. Even if you could share a lock across processes, you'd then run into lock contention issues, with each process blocked waiting for the other.

Either:

- have each process write to its own log file, or
- hand all writes to a single process, e.g. by logging to the syslog daemon (which is what SysLogHandler does).
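For the second option, here is a minimal sketch (not from the original answer) assuming a Linux host whose syslog daemon listens on /dev/log; the logger name log_writer is made up for illustration. Both cron jobs would call this instead of opening the file themselves, and syslog, being the single writer, serializes the records:

import logging
from logging.handlers import SysLogHandler

# Send records to the local syslog daemon instead of opening
# test_log_file.txt directly; syslog is the only process that
# writes the file, so no lock has to be shared between
# Log_writer1.py and Log_writer2.py.
logger = logging.getLogger('log_writer')  # hypothetical name
logger.setLevel(logging.INFO)
logger.addHandler(SysLogHandler(address='/dev/log'))

def log_event(level, msg):
    # Map 'info' -> logger.info, 'error' -> logger.error, etc.
    getattr(logger, level.lower())(msg)

for i in range(50):
    log_event('info', 'Thread 1 : msg')

Which file the records end up in (and whether they all go to one file) is then a matter of syslog configuration rather than of the Python code.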