Sabareesh Kappagantu

Reputation: 2577

Iteratively insert rows into a postgreSQL database using SQLAlchemy in python

I am writing a script that reads data from Excel files and writes it into a PostgreSQL database. I have read the data from the Excel file and stored each row's data as an array of strings.

I tried a couple of things, like

    for x in range(len(sh.columns)):
        i = users.insert()
        i.execute({sh.columns[x]: sh.values[0][x]})

Here sh refers to my Excel sheet, and users is the table I'm editing in the PostgreSQL database.

I also tried an

'INSERT INTO users VALUES %s' % (tuple(sh.values[0]),)

and a few more variations of that.
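For context, `insert()` expects a whole row as one dict, not one column per call as in the loop above. A minimal sketch of building that dict from the column names and one row of values (the column names and values here are hypothetical stand-ins for `sh.columns` and `sh.values[0]`):

```python
# Hypothetical column names and one row of values, standing in for
# sh.columns and sh.values[0] from the question.
columns = ["name", "email", "age"]
row_values = ["Alice", "alice@example.com", "30"]

# Pair each column name with its value to get the dict insert() expects
row_dict = dict(zip(columns, row_values))
# i = users.insert()
# i.execute(row_dict)  # one execute per row, all columns at once
```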

Upvotes: 1

Views: 3519

Answers (2)

Sabareesh Kappagantu

Reputation: 2577

Here is my solution. This is what I used and it works like a charm!

import pandas as pd
import sqlalchemy as sql
import sqlalchemy.orm as orm

# Read the sheet and connect to the database
sh = pd.ExcelFile("ExcelFileName").parse("SheetName")
db = sql.create_engine('DBtype://username:password@localhost:portnumber/dbname')
metadata = sql.schema.MetaData(bind=db, reflect=True)

# Reflect the existing table from the database
table = sql.Table('TableinDB', metadata, autoload=True)

class Row(object):
    pass

# Classical mapping of the Row class onto the reflected table
orm.mapper(Row, table)
print(table.columns)

Sess = orm.sessionmaker(bind=db)
session = Sess()

for x in range(len(sh.values)):
    row1 = Row()
    row1.field1 = sh.values[x][0]
    row1.field2 = sh.values[x][1]
    # Add as many fields as you want
    session.add(row1)  # must be inside the loop so every row is added
session.commit()
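If the ORM mapping feels heavy, the same row-by-row load can also be done with a single `executemany` call on the driver. A minimal sketch using the stdlib sqlite3 module as a stand-in database (on PostgreSQL only the driver and its placeholder style differ; the table and field names here are hypothetical):

```python
import sqlite3

# In-memory database standing in for the PostgreSQL table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (field1 TEXT, field2 TEXT)")

# In the original code these rows would come from sh.values
rows = [("a1", "b1"), ("a2", "b2")]

# One executemany replaces the explicit for-loop over the rows
conn.executemany("INSERT INTO users VALUES (?, ?)", rows)
conn.commit()
```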

Upvotes: 1

nom-mon-ir

Reputation: 3918

To make sure you can write to the table, try a more raw approach first:

i.execute("INSERT INTO table (field1, field2, field3) VALUES (%s, %s, %s)" % (data1, data2, data3))

Modify the above to have appropriate table name, field names and data values.

Once this works, you can refine writing from sh.
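One caveat with the `%`-interpolation above: string values break on embedded quotes and it is open to SQL injection. Passing the values as a separate parameter tuple lets the driver do the quoting. A sketch using the stdlib sqlite3 module (psycopg2 uses `%s` placeholders instead of `?`, but the same two-argument `execute` call; the table and field names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (field1 TEXT, field2 TEXT, field3 TEXT)")

data1, data2, data3 = "a", "O'Brien", "c"  # note the embedded quote

# The driver escapes each bound value itself, so quotes in the data are safe
conn.execute(
    "INSERT INTO t (field1, field2, field3) VALUES (?, ?, ?)",
    (data1, data2, data3),
)
conn.commit()
```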

Upvotes: 1
