Moosa

Reputation: 3216

Exclude specific rows in COPY command on RedShift

I'm using the COPY command to load a CSV file from S3 into a Redshift table. It's a 5-million-row file, and I get a load error saying row 259123 has text instead of a number.

Is there an option to exclude that row when loading? I'm guessing there will be a few more like it, so I'm looking for a way to exclude a set of specific rows from the import.

copy newtable from 's3://data.csv' credentials
'aws_access_key_id=ttt;aws_secret_access_key=ttt' 
delimiter ',' IGNOREHEADER as 1

Upvotes: 0

Views: 3930

Answers (1)

Soner

Reputation: 126

You can use the "MAXERROR error_count" option. Say you want to tolerate up to 1 error, then:

 copy newtable from 's3://data.csv' credentials
'aws_access_key_id=ttt;aws_secret_access_key=ttt' 
 delimiter ',' IGNOREHEADER as 1 MAXERROR 1

Amazon Redshift documentation
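If you want to see which rows were skipped (and why), Redshift records every rejected row in the STL_LOAD_ERRORS system table. A sketch of such a query, assuming your user has access to the system tables:

    -- Show the most recent load errors: which file, which line,
    -- which column failed, the raw text, and the reason
    select starttime, filename, line_number, colname, err_reason, raw_line
    from stl_load_errors
    order by starttime desc
    limit 10;

That lets you verify the rows MAXERROR skipped were really the bad ones, and fix or exclude them at the source if needed.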

Upvotes: 2
