Reputation: 5220
As part of my Flask and Celery application, I'm trying to move data from an AWS Aurora PostgreSQL database to Redshift. I'll be running this application in Kubernetes.
My approach is to query the Aurora PostgreSQL database, write the result set to a CSV file saved on an attached volume, upload that file to S3, and then import it into Redshift.
However, I came across an article that describes uploading the result set directly to S3 as a CSV file, without the intermediate volume:
https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/AuroraMySQL.Integrating.SaveIntoS3.html
It mentions the SELECT ... INTO OUTFILE S3 statement, but only for MySQL; it doesn't say anything about PostgreSQL.
Is it even possible to use this command on Aurora PostgreSQL and export to S3?
Upvotes: 1
Views: 2688
Reputation: 2977
Aurora comes in two editions: MySQL-compatible and PostgreSQL-compatible. The SELECT ... INTO OUTFILE S3 statement from that documentation page is an Aurora MySQL feature, so you can't use it on Aurora PostgreSQL. The equivalent on Aurora PostgreSQL is the aws_s3 extension, whose aws_s3.query_export_to_s3 function exports a query's result set directly to S3 without an intermediate file.
Upvotes: 1
Reputation: 1630
If you can connect to the database with psql, you can use the \copy meta-command to export the output of any SELECT statement to a CSV file:
https://codeburst.io/two-handy-examples-of-the-psql-copy-meta-command-2feaefd5dd90
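Since the asker's application is Python, one way to drive this from code is to shell out to psql (a sketch; `copy_query_to_csv` is a hypothetical helper, and you could equally use psycopg2's `cursor.copy_expert` to avoid the subprocess entirely):

```python
import subprocess


def copy_query_to_csv(query: str, out_path: str, dsn: str) -> list[str]:
    """Build a psql command line whose \\copy streams the query to a local CSV."""
    meta = f"\\copy ({query}) to '{out_path}' with (format csv, header)"
    return ["psql", dsn, "-c", meta]


# Placeholder DSN and path for illustration; run with subprocess.run(cmd, check=True)
cmd = copy_query_to_csv(
    "SELECT id, name FROM users",
    "/data/users.csv",
    "postgresql://user:pass@aurora-host:5432/mydb",
)
```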
Upvotes: 1