Gurmokh

Reputation: 2081

File does not complete write until script completes

I'm getting stuck on something I think should be simple enough. I'm creating a file containing a JSON string to import into a Postgres database. However, the file does not import, even though an internal test by the Python script says it is present.

However, if I execute the Postgres import after the script has completed, it copies fine, and it also works if I wrap the two steps in separate scripts and call them from a single one, but never when both requests are in the same script. I've tried close(), fsync(), and flush(), but with no luck.

Can anyone help?

The relevant code is below.

    command=str("PGPASSWORD=password psql -d database -U postgres -V -c \"copy import.table from 'Data.txt'\"")
    print command

    dataFile=open('Data.txt','w')
    for x in xx:
        xString=json.loads(data)
        xString[i]['source']=x
        xString[i]['created_at'].replace('"','')
        xStringJson=json.dumps(xString)
        dataFile.write(xStringJson)
        dataFile.close
        dataFile.flush()
        os.fsync(dataFile)
        print os.path.isfile('Data.txt')
        pg_command(command)
    i=i+1

Upvotes: 0

Views: 96

Answers (1)

zvone

Reputation: 19372

You are not closing the file.

This does nothing, because it is missing the parentheses:

dataFile.close

But even if it did close the file, it would do so in the first iteration through xx, and every later write would then raise ValueError: I/O operation on closed file.
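
For illustration, here is a minimal standalone sketch (not the asker's code) of the difference:

    f = open('Data.txt', 'w')
    f.close    # only looks up the bound method object; the file stays open
    f.close()  # actually closes the file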

Do it this way:

    with open('Data.txt','w') as dataFile:
        for x in xx:
            # write to the file

    # when you are back here, the file is flushed, closed, and ready to be read
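
Applied to the code in the question, a minimal sketch could look like the following (assuming data, xx, i, command, and the pg_command() helper exist exactly as in the question; note also that str.replace() returns a new string, so its result must be assigned back):

    import json

    with open('Data.txt', 'w') as dataFile:
        for x in xx:
            xString = json.loads(data)
            xString[i]['source'] = x
            # str.replace() returns a new string; assign it back, otherwise
            # the quotes are never actually removed
            xString[i]['created_at'] = xString[i]['created_at'].replace('"', '')
            dataFile.write(json.dumps(xString))

    # the with block has closed the file by this point, so psql sees the
    # complete contents of Data.txt
    pg_command(command)

There is no need for manual flush() or fsync() calls: leaving the with block guarantees the data is flushed and the file is closed before psql runs.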

Upvotes: 4
