Reputation: 14661
I have a PostgreSQL db. Pandas has a 'to_sql' function to write the records of a DataFrame into a database, but I haven't found any documentation on how to update existing database rows using pandas once I'm finished with the DataFrame.
Currently I am able to read a database table into a DataFrame using pandas read_sql_table. I then work with the data as necessary. However, I haven't been able to figure out how to write that DataFrame back into the database to update the original rows.
I don't want to have to overwrite the whole table. I just need to update the rows that were originally selected.
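Roughly, the read side of what I'm doing looks like this (connection string and table name are just placeholders):
import pandas as pd
from sqlalchemy import create_engine

# placeholder connection string and table name
engine = create_engine('postgresql://user:password@localhost:5432/mydb')
df = pd.read_sql_table('my_table', con=engine)
# ... modify df here, then write the changed rows back ...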
Upvotes: 27
Views: 12716
Reputation: 1
For the SQLAlchemy case of reading a table into a df, changing the df, and then updating the table values based on the df, I found df.to_sql to work with name=<table_name>, index=False, if_exists='replace'.
This should replace the old values in the table with the ones you changed in the df.
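Roughly (the connection string and table name here are just placeholders):
from sqlalchemy import create_engine

# placeholder connection string and table name
engine = create_engine('postgresql://user:password@localhost:5432/mydb')

# if_exists='replace' drops the table and rewrites it from the df
df.to_sql(name='my_table', con=engine, index=False, if_exists='replace')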
Upvotes: 0
Reputation: 1626
One way is to make use of a SQLAlchemy "table class" together with session.merge(row_data) and session.commit():
Here is an example:
for i in range(len(df)):
    # build a mapped object from the DataFrame row (.ix is deprecated; use .iloc)
    row_data = table_class(column_1=df.iloc[i]['column_name_1'],
                           column_2=df.iloc[i]['column_name_2'],
                           ...
                           )
    # merge() matches on the primary key and updates the existing row
    session.merge(row_data)
session.commit()
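The snippet assumes table_class is a declarative SQLAlchemy model and session is an open Session. A minimal sketch of that setup (SQLAlchemy 1.4+, with a hypothetical table, columns, and connection string) might look like:
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

# hypothetical mapped class standing in for table_class above
class MyTable(Base):
    __tablename__ = 'my_table'
    id = Column(Integer, primary_key=True)
    column_1 = Column(String)
    column_2 = Column(String)

# placeholder connection string; adjust for your database
engine = create_engine('postgresql://user:password@localhost:5432/mydb')
Session = sessionmaker(bind=engine)
session = Session()
Note that merge() looks up the existing row by primary key, so the DataFrame needs to carry the key columns for the rows you want to update.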
Upvotes: 5