Neil

Reputation: 8247

Statement too large while pushing data into Redshift database from Python

I am pushing a pandas DataFrame into a Redshift table and getting the following error:

cur.execute("INSERT INTO sir_main VALUES " + str(args_str))
psycopg2.ProgrammingError: Statement is too large. Statement Size: 58034743 
bytes. Maximum Allowed: 16777216 bytes

This halts execution. Is there any way to configure this limit while pushing into the database?

Upvotes: 4

Views: 3101

Answers (1)

Joe Harris

Reputation: 14035

If you are loading more than a few hundred rows, you should save the DataFrame as a flat file to S3 and load it into Redshift using COPY: https://docs.aws.amazon.com/redshift/latest/dg/r_COPY.html
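A minimal sketch of that approach, assuming boto3 for the upload; the bucket name, object key, IAM role ARN, and table name here are placeholders, not details from the question:

import io

import boto3
import pandas as pd
import psycopg2


def copy_df_to_redshift(df, conn,
                        table="sir_main",
                        bucket="my-staging-bucket",          # placeholder
                        key="staging/sir_main.csv",          # placeholder
                        iam_role="arn:aws:iam::123456789012:role/MyRedshiftRole"):  # placeholder
    # Serialize the DataFrame to CSV in memory and upload it to S3.
    buf = io.StringIO()
    df.to_csv(buf, index=False, header=False)
    boto3.client("s3").put_object(Bucket=bucket, Key=key,
                                  Body=buf.getvalue().encode("utf-8"))

    # COPY reads the file directly from S3, so the data never passes
    # through the SQL statement text and the 16 MB statement limit
    # no longer applies.
    with conn.cursor() as cur:
        cur.execute(
            f"COPY {table} FROM 's3://{bucket}/{key}' "
            f"IAM_ROLE '{iam_role}' FORMAT AS CSV"
        )
    conn.commit()

For very large DataFrames you can also split the data into several compressed files under a common S3 prefix; COPY will then load the parts in parallel across the cluster's slices.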

Upvotes: 2
