Néstor

Reputation: 351

Locking a file when using multiprocessing and multithreading - Python

I have a main.py file. This file uses multiprocessing to execute another file called function.py. function.py uses threading to apply a function, f, to every component of a numpy array. function.py reads a file, file.txt (only once in the whole process), to get some data for f, and then clears it (writes an empty file). Do I need to lock file.txt in function.py to avoid problems with having N processes, created from main.py, executing function.py and reading from and writing to file.txt? If so, how can it be done?

Finally I got it working with a semaphore.
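Since the question mentions solving this with a semaphore, here is a minimal sketch of that approach: a `multiprocessing.Semaphore` with an initial value of 1 (i.e. used as a mutex), shared with every worker process, so only one process at a time can read and clear the file. The `worker` function and the file contents are hypothetical stand-ins for what function.py does.

```python
import multiprocessing

FILE = "file.txt"

def worker(sem):
    # Only one process at a time may enter this block.
    with sem:
        with open(FILE) as fh:
            data = fh.read()       # read the data for f
        with open(FILE, "w"):
            pass                   # clear it: write an empty file
    # ... apply f to the numpy array using `data` ...

if __name__ == "__main__":
    # Hypothetical setup: create file.txt and launch N processes.
    with open(FILE, "w") as fh:
        fh.write("some data for f")
    sem = multiprocessing.Semaphore(1)
    procs = [multiprocessing.Process(target=worker, args=(sem,))
             for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

Note that the semaphore must be created in main.py and passed to the child processes; a semaphore created independently inside each process would not provide any mutual exclusion.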

Upvotes: 1

Views: 1723

Answers (1)

Andreas

Reputation: 94

Yes, in some way it has to be locked. Having several processes reading a file is no problem as long as they are only reading it. As soon as something writes to the file you have to be sure that the reads and writes occur in the desired order.

Locking could be done by using a lockfile that is created atomically. After a process successfully creates the lockfile, it gains access to the text file. When the process is done with the text file, it deletes the lockfile. This ensures that only one process can access the text file at a given time.
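A sketch of this lockfile scheme, assuming the file is file.txt: `os.open` with `O_CREAT | O_EXCL` fails if the file already exists, so creating the lockfile is atomic and exactly one process can succeed at a time. The function names and the retry delay are illustrative choices, not part of the original answer.

```python
import os
import time

LOCKFILE = "file.txt.lock"

def acquire_lock(retry_delay=0.05):
    """Spin until we atomically create the lockfile."""
    while True:
        try:
            # O_CREAT | O_EXCL raises FileExistsError if the
            # lockfile already exists -- an atomic test-and-create.
            fd = os.open(LOCKFILE, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.close(fd)
            return
        except FileExistsError:
            time.sleep(retry_delay)   # another process holds the lock

def release_lock():
    os.remove(LOCKFILE)

def read_and_clear(path="file.txt"):
    """Read the data file, then truncate it, under the lock."""
    acquire_lock()
    try:
        with open(path) as fh:
            data = fh.read()
        with open(path, "w"):
            pass                      # write an empty file
        return data
    finally:
        release_lock()                # released even on error
```

One caveat of this approach: if a process crashes while holding the lock, the stale lockfile must be removed by hand (or aged out) before other processes can proceed.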

Upvotes: 1
