Hi, I have the following function to save a CSV file:
import csv

mysql_cur = mysql_conn.cursor()
mysql_cur.execute('select * from %s where Date > "2018-01-01";' % mysql_table_name)
description = mysql_cur.description
rows = mysql_cur.fetchall()
with open('kpi.csv', 'w', newline='') as f:  # kpi.csv is the file I upload to S3 below
    c = csv.writer(f)
    for row in rows:
        c.writerow(row)
Then I manually upload the file to an S3 bucket.
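(For completeness, this upload step could also be scripted instead of done manually; a minimal sketch using boto3, assuming the bucket and key from the COPY command below:)

import boto3

# Sketch of scripting the manual upload; bucket/key taken from the COPY command below.
s3 = boto3.client('s3')
s3.upload_file('kpi.csv', 'clab-migration', 'kpi.csv')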
Lastly, I run the COPY query:
redshift_cur = redshift_conn.cursor()
sql = """copy kpi_kpireport from 's3://clab-migration/kpi.csv'
credentials 'aws_access_key_id=ID;aws_secret_access_key=KEY/KEY/pL/KEY'
csv;"""
print(redshift_cur.execute(sql))
The COPY command works when I use it in SQL Workbench/J; I'm just not sure what I'm doing wrong with the syntax in the execute statement, since it doesn't copy any rows.
Upvotes: 3
Views: 8233
Reputation: 5729
Actually, the reason you are not seeing data in Redshift is that you have not enabled auto-commit. Your commands execute successfully, and the COPY does load data into Redshift, but the transaction is never committed. Hence, you don't see the data when you query from the console or from Workbench/J.
You should begin and commit the transaction explicitly. Here is a simple working example:
import psycopg2

def redshift():
    conn = psycopg2.connect(dbname='**_dev_**', host='888888888888****.u.****.redshift.amazonaws.com', port='5439', user='******', password='********')
    cur = conn.cursor()
    # Begin your transaction
    cur.execute("begin;")
    cur.execute("copy kpi_kpireport from 's3://clab-migration/kpi.csv' credentials 'aws_access_key_id=ID;aws_secret_access_key=KEY/KEY/pL/KEY' csv;")
    # Commit your transaction
    cur.execute("commit;")
    print("Copy executed fine!")

redshift()
Now, if you run the above code, you will see the data copied. If you remove the two lines cur.execute("begin;") and cur.execute("commit;") and run it again, it will complete without error, but you will not see the data in Redshift.
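Alternatively, instead of issuing begin and commit yourself, you can enable auto-commit on the psycopg2 connection before running the COPY; a minimal sketch, reusing the masked connection parameters from above:

import psycopg2

conn = psycopg2.connect(dbname='**_dev_**', host='888888888888****.u.****.redshift.amazonaws.com', port='5439', user='******', password='********')
conn.autocommit = True  # every execute() now commits immediately
cur = conn.cursor()
cur.execute("copy kpi_kpireport from 's3://clab-migration/kpi.csv' credentials 'aws_access_key_id=ID;aws_secret_access_key=KEY/KEY/pL/KEY' csv;")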
Upvotes: 5