manugupt1

Reputation: 2437

Batch updating over 200000 rows in a table using psycopg2

Hi, I am trying to update over 200000 rows (exactly 273649).

I wrote the following code to update:

with open('gene.csv') as csvfile:
    reader = csv.DictReader(csvfile)
    for row in reader:
        sql = "UPDATE sequence_group_annotation SET STATUS = '" + row['status'] + "', UPDATED_DATE=CURRENT_DATE where seq_id_pk='" + row['ensembl_gene_id'] + "'"
        print sql
        cur.execute(sql)

I am redirecting the output of the print sql statement to a file:

update.py > results.txt

Ideally this should update the table, but it is not doing so.

To check, I ran the following query:

select seq_id_pk,status from sequence_group_annotation where status ISNULL and TYPE <> 3

and looked for the corresponding seq_id_pk in my results.txt file.

I copied the SQL statement from results.txt and ran it manually; it updated the row successfully.

Any idea on why this is not working?

Upvotes: 0

Views: 642

Answers (1)

fog

Reputation: 3391

You forgot to commit() your changes. psycopg2 begins an implicit transaction for you on the first execute(), and you need to call commit() or rollback() on the connection object at the end.
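This is standard DB-API 2.0 behavior, so it can be demonstrated without a Postgres server using the standard-library sqlite3 module (used here purely as a stand-in; the table and column names are borrowed from the question, and the data is made up). A second connection cannot see the UPDATE until commit() is called:

```python
import os
import sqlite3
import tempfile

# File-backed database so two independent connections can observe each other.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

conn = sqlite3.connect(path)
cur = conn.cursor()
cur.execute("CREATE TABLE sequence_group_annotation "
            "(seq_id_pk TEXT PRIMARY KEY, status TEXT)")
cur.execute("INSERT INTO sequence_group_annotation VALUES ('ENSG1', NULL)")
conn.commit()

# The UPDATE runs inside an implicit transaction, just like psycopg2's execute().
cur.execute("UPDATE sequence_group_annotation SET status = 'done' "
            "WHERE seq_id_pk = 'ENSG1'")

# A second connection still sees the old value: nothing is committed yet.
other = sqlite3.connect(path)
before = other.execute(
    "SELECT status FROM sequence_group_annotation").fetchone()[0]
other.close()

conn.commit()  # the step missing from the question's script

# After commit, a fresh connection sees the new value.
other2 = sqlite3.connect(path)
after = other2.execute(
    "SELECT status FROM sequence_group_annotation").fetchone()[0]
other2.close()
conn.close()
```

In the question's script the fix is the same: call commit() on the psycopg2 connection object (whatever it is named; the snippet only shows the cursor cur) after the loop finishes.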

Upvotes: 2
