Reputation: 65
I'm trying to unload the results of a Redshift query to a bucket folder named with yesterday's date.
So in the example below, the data should end up in s3://mybucket/reporting/20170321/myreport-000
unload ('select * from #my_report')
to 's3://mybucket/reporting/' || replace(trunc(current_date - 1),'-','') || '/myreport'
credentials 'aws_access_key_id=key;aws_secret_access_key=secretkey'
delimiter ','
gzip addquotes null as ''
PARALLEL off
allowoverwrite;
However, I get the error:
[Amazon](500310) Invalid operation: syntax error at or near "||";
Doesn't the string concatenation get evaluated first?
Upvotes: 3
Views: 2074
Reputation: 1802
It's not possible to dynamically build the path in an UNLOAD statement.
Two methods I can suggest:
RedshiftCopyActivity in AWS Data Pipeline to copy from Redshift to S3. The input is a RedshiftDataNode and the output is an S3DataNode, where you can specify an expression for directoryPath.
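Since Redshift won't evaluate expressions in the TO clause, the other common workaround is to build the UNLOAD statement in a client script, so the date is already a literal by the time the SQL reaches the cluster. Here is a minimal sketch of that idea; psycopg2 is assumed as the driver, and the connection details, credentials, bucket, and table name are all placeholders:

from datetime import date, timedelta
import psycopg2  # assumed driver; any Redshift-compatible PostgreSQL client works

# Compute yesterday's date as YYYYMMDD before the SQL is sent to Redshift.
yesterday = (date.today() - timedelta(days=1)).strftime("%Y%m%d")

# my_report stands in for the source table or view; the original #my_report
# temp table would only exist inside the session that created it.
unload_sql = f"""
unload ('select * from my_report')
to 's3://mybucket/reporting/{yesterday}/myreport'
credentials 'aws_access_key_id=key;aws_secret_access_key=secretkey'
delimiter ','
gzip addquotes null as ''
parallel off
allowoverwrite;
"""

# Placeholder connection details for the cluster.
conn = psycopg2.connect(
    host="my-cluster.example.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="mydb",
    user="myuser",
    password="mypassword",
)
conn.autocommit = True
with conn.cursor() as cur:
    cur.execute(unload_sql)
conn.close()

Scheduling that script (cron, Lambda, etc.) gives the same daily-folder behavior the UNLOAD alone can't provide.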
Upvotes: 3