bluefoggy

Reputation: 1061

Python: Two scripts working with the same file, one updating it, another deleting the data when processed

First, I am new to Python. My question is this:

I have a callback script running on a remote machine which sends some data and runs a script on my local machine; that script processes the data and writes it to a file. Now another local script of mine needs to process the file's data one entry at a time and delete each entry from the file when done. The problem is that the file may be updated continuously. How do I synchronize the work so that it doesn't mess up my file? Also, please suggest whether the same work can be done in some better way.

Upvotes: 0

Views: 1209

Answers (2)

Jonas Schäfer

Reputation: 20738

I would suggest looking into named pipes or sockets, which seem better suited to your purpose than a file — at least if it's really just between those two applications and you have control over the source code of both.

For example, on unix, you could create a pipe like (see os.mkfifo):

import os
os.mkfifo("/some/unique/path")

And then access it like a file:

dest = open("/some/unique/path", "w")  # on the sending side
src = open("/some/unique/path", "r")   # on the reading side

The data will be queued between your processes. It's first-in-first-out, really, but it behaves (mostly) like a file.
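Putting the two sides together, a minimal end-to-end sketch might look like the following. The path is hypothetical, and `os.fork` stands in for the two separate scripts so the example is self-contained; on a real setup, the sender and reader would simply be two independent processes opening the same FIFO path (unix only).

```python
import os

FIFO_PATH = "/tmp/demo_fifo"  # hypothetical path; pick any unique path

if not os.path.exists(FIFO_PATH):
    os.mkfifo(FIFO_PATH)

received = []

pid = os.fork()
if pid == 0:
    # Child process plays the "sending" script.
    with open(FIFO_PATH, "w") as dest:
        for i in range(3):
            dest.write("record %d\n" % i)
    os._exit(0)
else:
    # Parent plays the "reading" script; open() blocks until a writer connects,
    # and the loop ends when the writer closes its end (EOF).
    with open(FIFO_PATH, "r") as src:
        for line in src:
            received.append(line.strip())
    os.waitpid(pid, 0)
    os.unlink(FIFO_PATH)

print(received)
```

Because the kernel queues the bytes, the reader never sees a half-written file the way it could with a shared regular file, and nothing needs to be deleted afterwards — consuming the data removes it from the pipe.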

If you cannot go for named pipes like this, I'd suggest using IP sockets over localhost from the socket module, preferably DGRAM sockets, since you don't need to do any connection handling with those. You seem to know how to do networking already.
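For illustration, a minimal DGRAM (UDP) sketch over localhost — the port number is an arbitrary choice, and both sockets live in one script here purely to keep the example self-contained:

```python
import socket

ADDR = ("127.0.0.1", 50007)  # hypothetical port; any free local port works

# Receiving side: bind and wait for datagrams (no accept/listen needed).
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(ADDR)

# Sending side: no connection handshake, just send a datagram.
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.sendto(b"some data", ADDR)

data, peer = recv_sock.recvfrom(4096)

send_sock.close()
recv_sock.close()
```

Each `sendto` arrives as one whole datagram on the receiving side, which gives you natural message boundaries for free — something a plain file or stream socket does not.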

Upvotes: 1

Ignacio Vazquez-Abrams

Reputation: 799190

I would suggest using a database whose transactions allow for concurrent processing.
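As a sketch of that idea using SQLite from the standard library (the table name and schema here are my own assumptions, and an in-memory database stands in for the shared file both scripts would open): each insert and each fetch-and-delete runs in its own transaction, so the writer and the consumer never see each other's half-finished work.

```python
import sqlite3

# In real use, both scripts would open the same database file path;
# SQLite serializes concurrent transactions on it for you.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS queue (id INTEGER PRIMARY KEY, payload TEXT)"
)

# Writer side: each insert commits as its own transaction.
with conn:
    conn.execute("INSERT INTO queue (payload) VALUES (?)", ("some data",))

# Consumer side: fetch the oldest row and delete it in one transaction.
with conn:
    row = conn.execute(
        "SELECT id, payload FROM queue ORDER BY id LIMIT 1"
    ).fetchone()
    if row is not None:
        conn.execute("DELETE FROM queue WHERE id = ?", (row[0],))

remaining = conn.execute("SELECT COUNT(*) FROM queue").fetchone()[0]
```

The delete-when-processed step from the question becomes a single `DELETE` inside a transaction, instead of rewriting a shared text file under the other script's feet.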

Upvotes: 0
