psk98

Reputation: 71

Export from Aurora Serverless into S3

I am currently trying to export a .csv file from AWS Aurora Serverless (MySQL engine) to AWS S3. This works perfectly fine with an instance-backed Aurora cluster, but not with the serverless one. I set up the IAM policies for S3 as described in the documentation and created an IAM user with full access to S3 and RDS. The parameter group used by the serverless cluster has the ARN of that IAM user in the LOAD DATA FROM S3 and SELECT INTO OUTFILE S3 fields.

The code I am using to export to S3:

SELECT * FROM TABLE WHERE ID = '6838' INTO OUTFILE S3 's3://bucketname/file.csv';

I've read the documentation here: https://docs.aws.amazon.com/de_de/AmazonRDS/latest/AuroraUserGuide/AuroraMySQL.Integrating.SaveIntoS3.html

The error message: Access denied; you need (at least one of) the SELECT INTO S3 privilege(s) for this operation

I expected Aurora Serverless to export as smoothly as the instance-backed Aurora does. The instance-backed Aurora uses the same IAM user that was created for the serverless cluster, and it works fine.

EDIT: It seems that AWS Aurora Serverless supports neither importing from nor exporting to S3. (https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/aurora-serverless.html#aurora-serverless.limitations)

Upvotes: 7

Views: 6058

Answers (3)

enharmonic

Reputation: 2080

If you're OK with temporarily putting the data on a local machine or an EC2 instance, you can do it in two steps:

  1. Pull the data from the database:

     mysql --defaults-file=/path/to/.my.cnf -e "SELECT * FROM TABLE WHERE ID = '6838'" | tr '\t' ',' > /local/path/to/file.csv

     If you're OK with tab-separated output, you don't need to include tr in the pipeline.

  2. Copy the file to S3:

     aws s3 cp /local/path/to/file.csv s3://bucketname/file.csv
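The tr substitution in step 1 can be sanity-checked locally without touching the database. A minimal sketch (the sample rows and the /tmp path are placeholders; note that a plain tr swap mishandles field values that themselves contain tabs or commas):

```shell
# Simulate the tab-separated rows that the mysql client prints,
# then convert tabs to commas exactly as the step-1 pipeline does.
# (A naive tr breaks if a field value itself contains a tab or comma.)
printf 'id\tname\n6838\twidget\n' | tr '\t' ',' > /tmp/rows.csv
cat /tmp/rows.csv
```

The resulting file would then be uploaded as in step 2, e.g. aws s3 cp /tmp/rows.csv s3://bucketname/file.csv, assuming the AWS CLI is configured with credentials that can write to the bucket.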

References

Copying to s3

Using MySQL options files

Upvotes: 0

Linda Naoui

Reputation: 147

Actually, you can load your data this way only into an Aurora cluster, not a serverless one. I had the same issue. You might consider loading your data into an Amazon RDS MySQL database instead and letting a Lambda function handle the data injection from S3 into RDS MySQL.

Upvotes: 3

Ashaman Kingpin

Reputation: 1577

Did you complete this step (assuming you are issuing the statement from a non-master DB account)?

The database user that issues the SELECT INTO OUTFILE S3 statement must be granted the SELECT INTO S3 privilege to issue the statement. The master user name for a DB cluster is granted the SELECT INTO S3 privilege by default. You can grant the privilege to another user by using the following statement.

GRANT SELECT INTO S3 ON *.* TO 'user'@'domain-or-ip-address'

Upvotes: 1

Related Questions