Pratibha UR

Reputation: 138

UNLOAD to a new file when running in Redshift

I am trying to UNLOAD a table to a file in an S3 bucket. However, I don't want to overwrite the existing file; I want to create a new file every time I run the command. How can I achieve this?

UNLOAD ('select * from table1')
TO 's3://bucket/file1/file2/file3/table1.csv'
IAM_ROLE 'arn:aws:iam::0934857378:role/RedshiftAccessRole,arn:aws:iam::435874575846546:role/RedshiftAccessRole'
DELIMITER ','
PARALLEL OFF
HEADER;

Upvotes: 0

Views: 414

Answers (1)

John Rotenstein

Reputation: 270104

Just change the destination path specified in the TO clause each time you run the UNLOAD, for example by including a timestamp or an incrementing number in the path.

If you wish to do this programmatically, you could generate the path in whatever script or command sends the UNLOAD statement.
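For example, here is a minimal sketch of that approach in Python, assuming the psycopg2 driver; the host, credentials and IAM role ARN are placeholders, not values from the question. It puts a timestamp in the S3 path so each run writes to a new file:

# A minimal sketch: build a unique S3 path with a timestamp,
# then send the UNLOAD through a Redshift connection.
# Host, credentials and the IAM role ARN below are placeholders.
from datetime import datetime, timezone

import psycopg2

timestamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
destination = f"s3://bucket/file1/file2/file3/table1_{timestamp}.csv"

unload_sql = f"""
    UNLOAD ('select * from table1')
    TO '{destination}'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftAccessRole'
    DELIMITER ','
    PARALLEL OFF
    HEADER
"""

conn = psycopg2.connect(
    host="my-cluster.xxxxxx.us-east-1.redshift.amazonaws.com",  # placeholder
    port=5439,
    dbname="dev",
    user="awsuser",
    password="...",
)
conn.autocommit = True
with conn.cursor() as cur:
    cur.execute(unload_sql)

The same idea works with any client that can run SQL against Redshift; the only requirement is that the destination path differs on every run.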

You might be able to do it via a Stored Procedure by keeping a table with the last file number and writing code to retrieve and increment it.

Or, you could write an AWS Lambda function that is triggered upon creation of the file. The Lambda function could then copy the object to a different path/filename and delete the original object.
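A rough sketch of such a Lambda function, assuming a Python runtime, an S3 "object created" trigger on the prefix UNLOAD writes to, and a hypothetical archive/ prefix for the renamed copies:

# Rough sketch of the copy-and-delete idea; the bucket layout and the
# archive/ prefix are assumptions, not part of the original answer.
from datetime import datetime, timezone
from urllib.parse import unquote_plus

import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = unquote_plus(record["s3"]["object"]["key"])

        # Copy the unloaded object to a new, timestamped key ...
        timestamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
        new_key = f"archive/table1_{timestamp}.csv"
        s3.copy_object(
            Bucket=bucket,
            Key=new_key,
            CopySource={"Bucket": bucket, "Key": key},
        )
        # ... then delete the original so the next UNLOAD can reuse its path.
        s3.delete_object(Bucket=bucket, Key=key)

Scope the trigger to the prefix that UNLOAD writes to (not to archive/), otherwise the copy would re-trigger the function.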

Upvotes: 0
