Reputation: 393
I tried to import a CSV file from an S3 bucket into Postgres Aurora RDS using the aws_s3 extension. When I ran it at the psql console, it first raised an error, and then the same import succeeded right after. Why does it error the first time?
$ psql -h 127.0.0.1 -p 5432 -U user mydb
Password for user mydb:
psql (12.2, server 11.7)
SSL connection (protocol: TLSv1.2, cipher: ECDHE-RSA-AES256-GCM-SHA384, bits: 256, compression: off)
Type "help" for help.
mydb=> select *
from aws_s3.table_import_from_s3(
'my_table',
'',
'(format csv, header false)',
aws_commons.create_s3_uri(
'my_bucket',
format('imports/%s/my_file.csv', 'sub_dir'),
'ap-northeast-1'
)
);
ERROR: Unable to complete COPY from S3, HTTP error code: 168
CONTEXT: SQL function "table_import_from_s3" statement 1
mydb=>
mydb=>
mydb=> select *
from aws_s3.table_import_from_s3(
'my_table',
'',
'(format csv, header false)',
aws_commons.create_s3_uri(
'my_bucket',
format('imports/%s/my_file.csv', 'sub_dir'),
'ap-northeast-1'
)
);
table_import_from_s3
------------------------------------------------------------------------------------------------------------------------
18638 rows imported into relation "my_table" from file imports/sub_dir/my_file.csv of 2033585 bytes
(1 row)
This strange phenomenon keeps happening.
Upvotes: 0
Views: 2946
Reputation: 393
I resolved this error. The cause was that I was importing the file without specifying the option to escape double quotation marks. I was trying to import a file like the following.
aaa,bbb,ccc
aaa,bbb,ccc
"aaa",bbb,ccc
You need to specify the escape option for the double quotation marks:
select *
from aws_s3.table_import_from_s3(
'my_table',
'',
'(format csv, escape ''"'')',
aws_commons.create_s3_uri(
'my_bucket',
format('imports/%s/my_file.csv', 'sub_dir'),
'ap-northeast-1'
)
);
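As a side note, the corrected call above drops the header false option used in the original query. If the file has no header row and also contains quoted fields, a sketch combining both options (using the same placeholder table, bucket, and path as above) could look like the following; in PostgreSQL's CSV format the quote character already defaults to a double quote, so naming it here is just being explicit.
-- sketch only: combines header false from the question with explicit quote/escape handling
select *
from aws_s3.table_import_from_s3(
  'my_table',
  '',
  '(format csv, header false, quote ''"'', escape ''"'')',
  aws_commons.create_s3_uri(
    'my_bucket',
    format('imports/%s/my_file.csv', 'sub_dir'),
    'ap-northeast-1'
  )
);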
Upvotes: 1