Amaranth

Reputation: 2491

Concurrent access to a data file in Python

I have a small web server doing some operations on a POST request. It reads a data file, does some checks, and then rewrites the file, adding some information from the POST body.

The issue is that if two clients send a POST request at almost the same time, both read the same file. One then writes the file with its new information, and the other overwrites it with only its own information, losing the first client's changes, since those weren't in the file when it was read.

import json
import web
import yaml

f = open("foo.txt", "r+")
tests_data = yaml.safe_load(f)
post_data = json.loads(web.data())
# Some checks, merging post_data into tests_data

f.seek(0)      # rewind before rewriting, otherwise the dump is appended
f.truncate()
f.write(json.dumps(tests_data))
f.close()

I want the script to wait, without raising an error, at the open() call if the file is already open in another process running the same code, and then read the file once that process has finished and closed it.

Or something else if other solutions exist.
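For context, one cross-process approach (a minimal sketch, assuming a POSIX system, since the `fcntl` module is Unix-only) is an exclusive advisory lock with `fcntl.flock()`: the second process blocks at the lock call, much like the "wait at open" behaviour described above, until the first process closes the file:

```python
import fcntl
import os

path = "foo.txt"

# Ensure the file exists before locking (sketch; a real server
# might create it once at startup instead).
if not os.path.exists(path):
    with open(path, "w") as f:
        f.write("")

with open(path, "r+") as f:
    # Blocks here until any other process releases its lock on the file.
    fcntl.flock(f, fcntl.LOCK_EX)
    data = f.read()
    # ... checks / merge the POST data here ...
    f.seek(0)      # rewind and truncate before rewriting
    f.truncate()
    f.write(data + "new entry\n")
    # Closing the file releases the lock.
```

Note that `flock` locks are advisory: they only work if every process that touches the file uses the same locking protocol.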

Upvotes: 1

Views: 3621

Answers (1)

user163757

Reputation: 7025

Would a standard lock not suit your needs? The lock would need to be at the module level.

from threading import Lock

# this needs to be a module-level variable
lock = Lock()  # note the parentheses: Lock() creates the lock object

with lock:
    # do your stuff.  only one thread at a time can
    # work in this space...
    ...
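For completeness, a runnable sketch of that idea applied to the question's read-modify-write cycle (`handle_post` and the JSON-only storage are illustrative assumptions; this only helps when both requests are served by threads of the same process, which is the default in web.py):

```python
import json
from threading import Lock

# Module-level lock shared by every request-handling thread.
data_lock = Lock()

def handle_post(post_payload, path="foo.txt"):
    # Hypothetical handler: hold the lock across the whole
    # read-modify-write cycle so two threads cannot interleave
    # their reads and writes.
    with data_lock:
        with open(path, "r+") as f:
            tests_data = json.load(f)
            tests_data.update(json.loads(post_payload))
            f.seek(0)      # rewind and truncate before rewriting
            f.truncate()
            json.dump(tests_data, f)
```

If the server ever runs as multiple processes (e.g. behind a prefork WSGI server), a threading lock is no longer enough and a file or database lock is needed instead.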

Upvotes: 3
