namratha Mk

Reputation: 23

Problems copying a CSV file from S3 to Redshift

I am getting the following error when I run a COPY command to copy the contents of a .csv file in S3 to a table in Redshift.

Error: "String length exceeds DDL length".

I am using the following COPY command:

COPY enjoy FROM 's3://nmk-redshift-bucket/my_workbook.csv'
CREDENTIALS 'aws_access_key_id=****;aws_secret_access_key=****'
CSV QUOTE '"' DELIMITER ',' NULL AS '\0';

I figured I would open the link S3 gives for my file through the AWS console. The link for the workbook is: link to my s3bucket cvs file

The file shown there is full of strange characters that I don't understand. The COPY command is reading those characters instead of the data I entered in my CSV file, which is what leads to the string length error.

I use SQL Workbench to query. The raw_field_value column of the stl_load_errors table in Redshift contains characters similar to the ones in the link I mentioned above; that is how I found out what input COPY is actually reading.
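
For anyone debugging something similar, here is a minimal sketch of querying stl_load_errors from Python with psycopg2 (Redshift speaks the Postgres wire protocol). The connection details below are placeholders, not values from this question:

import psycopg2

# Placeholder connection details -- replace with your own cluster endpoint.
conn = psycopg2.connect(
    host='my-cluster.xxxxxxxx.us-east-1.redshift.amazonaws.com',
    port=5439,
    dbname='dev',
    user='awsuser',
    password='****',
)

# stl_load_errors is a Redshift system table; raw_field_value shows
# exactly what bytes COPY tried to load into each column.
with conn.cursor() as cur:
    cur.execute("""
        SELECT starttime, filename, colname, err_reason, raw_field_value
        FROM stl_load_errors
        ORDER BY starttime DESC
        LIMIT 10;
    """)
    for row in cur.fetchall():
        print(row)

conn.close()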

I am new to AWS and UTF-8 configuration, so I would appreciate any help with this.

Upvotes: 2

Views: 1027

Answers (1)

Danny_ds

Reputation: 11406

The link you provided points to an .xlsx file (but with a .csv extension instead of .xlsx), which is actually a zip file.

That is why you see those strange characters: the first two are 'PK', the signature that marks a zip file.
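
You can confirm this locally with a few lines of Python (a sketch; the filename is your downloaded copy of the S3 object):

# Check the file's magic bytes: real CSV starts with plain text,
# while .xlsx (a zip archive) starts with b'PK'.
with open('my_workbook.csv', 'rb') as f:
    header = f.read(2)

if header == b'PK':
    print('This is a zip archive (probably an .xlsx workbook), not plain CSV.')
else:
    print('First bytes:', header)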

So you will have to export the workbook to .csv first, before using the file. One way to do that is sketched below.
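
If you would rather convert it programmatically than re-export from Excel, something like this should work (a sketch assuming pandas and openpyxl are installed; the filenames are placeholders):

import pandas as pd  # requires openpyxl for .xlsx support

# The file is really an Excel workbook despite the .csv extension,
# so read it as such and write out genuine CSV.
df = pd.read_excel('my_workbook.csv', engine='openpyxl')
df.to_csv('my_workbook_fixed.csv', index=False, encoding='utf-8')

After uploading the real CSV back to S3, the same COPY command should work.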

Upvotes: 4
