Yoli

Reputation: 65

Redshift Unload to S3 Location that is a Concatenated String

I'm trying to unload the results of a Redshift query to a bucket folder named after yesterday's date.

So in the below example the data should be in s3://mybucket/reporting/20170321/myreport-000

unload ('select * from #my_report')
to 's3://mybucket/reporting/' || replace(trunc(current_date - 1),'-','') || '/myreport'
credentials 'aws_access_key_id=key;aws_secret_access_key=secretkey'
delimiter ','
gzip addquotes null as ''
PARALLEL off
allowoverwrite;

However I get the error

[Amazon](500310) Invalid operation: syntax error at or near "||";

Does the string concatenation not evaluate first?

Upvotes: 3

Views: 2074

Answers (1)

Rahul Gupta

Reputation: 1802

It's not possible to build the path dynamically inside the UNLOAD statement itself: the TO clause must be a fixed string literal, so the `||` concatenation is never evaluated and Redshift reports a syntax error.

Two methods I can suggest:

  • Method 1: Write a shell/Python/JavaScript script in which you can compute the date, build the full UNLOAD statement as a string, and then execute that string over a connection.
  • Method 2: Use RedshiftCopyActivity in AWS Data Pipeline to copy from Redshift to S3. The input is a RedshiftDataNode and the output is an S3DataNode, where you can specify an expression for directoryPath.

If I’ve made a bad assumption please comment and I’ll refocus my answer.
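A minimal sketch of Method 1 in Python, assuming the same query and options from the question; the credentials are the question's placeholders, and the final execute call (e.g. via psycopg2) is shown only as a comment:

```python
from datetime import date, timedelta

def build_unload_sql(bucket_prefix, report_name):
    """Build an UNLOAD statement with yesterday's date baked into the S3 path.

    The date is formatted as YYYYMMDD to match s3://mybucket/reporting/20170321/.
    """
    yesterday = (date.today() - timedelta(days=1)).strftime("%Y%m%d")
    path = f"{bucket_prefix}{yesterday}/{report_name}"
    return (
        "unload ('select * from #my_report') "
        f"to '{path}' "
        "credentials 'aws_access_key_id=key;aws_secret_access_key=secretkey' "
        "delimiter ',' gzip addquotes null as '' "
        "PARALLEL off allowoverwrite;"
    )

sql = build_unload_sql("s3://mybucket/reporting/", "myreport")
# cursor.execute(sql)  # run the assembled statement over a Redshift connection
```

Because the path is interpolated before the statement reaches Redshift, the TO clause arrives as a plain string literal, which is what UNLOAD requires.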

Upvotes: 3
