Lazloo Xp

Reputation: 998

Synchronize Data Update Using PyODBC and SQL-Alchemy

I have a script that is used by several users. In this script, a database table is updated by first deleting its rows and then uploading the new data.

import pyodbc
import sqlalchemy

engine = sqlalchemy.create_engine('credentials', echo=False)

with pyodbc.connect('...') as cnxn:
    mycursor = cnxn.cursor()
    mycursor.execute('query_delete')
    mycursor.commit()
    mycursor.close()

# Precalculated dataframe containing the data (calculation not shown)
df.to_sql(name='db_table', con=engine, index=False, if_exists='append')

The problem is that two processes are sometimes badly timed, so the operations interleave like this:

delete (process 1)
delete (process 2)
upload (process 1)
upload (process 2)

If this happens, the data are duplicated. Is there a way to prevent this?

Upvotes: 0

Views: 375

Answers (1)

Gord Thompson

Reputation: 123819

You'll probably need to tighten up your transaction isolation using code similar to this (untested):

# SERIALIZABLE isolation prevents two transactions from interleaving
engine = sqlalchemy.create_engine(connection_uri, isolation_level='SERIALIZABLE')
with engine.begin() as conn:  # one transaction: delete and append commit together
    conn.execute(sqlalchemy.text('DELETE FROM db_table WHERE …'))
    df.to_sql(name='db_table', con=conn, index=False, if_exists='append')
print('Update complete.')
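One caveat with SERIALIZABLE isolation: when two sessions do conflict, the database resolves it by aborting one of them with an error (e.g. a deadlock or serialization failure), so the losing process should retry. A minimal retry sketch, assuming the delete-and-append above is wrapped in a function; the helper name, exception choice, and retry parameters are illustrative, not part of the answer:

```python
import time

def run_with_retry(operation, retryable_exc, attempts=3, delay=0.5):
    """Call `operation`, retrying when a retryable error is raised.

    Under SERIALIZABLE isolation the aborted transaction typically
    surfaces as e.g. sqlalchemy.exc.OperationalError; pass that as
    `retryable_exc` in real use.
    """
    for attempt in range(1, attempts + 1):
        try:
            return operation()
        except retryable_exc:
            if attempt == attempts:
                raise  # out of attempts: propagate the failure
            time.sleep(delay)  # brief back-off before retrying

# Hypothetical usage with the transaction from the answer:
# run_with_retry(replace_table_contents, sqlalchemy.exc.OperationalError)
```

Because the delete and the append sit in one transaction, a retry after an abort re-runs both steps from a clean state, so no duplicates can survive.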

Upvotes: 1
