Reputation: 21
I am trying to set up a Lambda to run an AWS Athena query daily and output the result to an S3 bucket in a different AWS account. The account I am writing the Lambda in has S3 write permissions in the other account; I just can't figure out how to specify the bucket I want to write to, and I haven't been able to find any documentation on this use case.
The following is how I'm running my Athena query from the Lambda:
client = boto3.client('athena')
client.start_query_execution(
    QueryString=[QUERY],
    QueryExecutionContext={
        'Database': [DATABASE]
    },
    ResultConfiguration={
        'OutputLocation': [OUTPUT_LOCATION]
    }
)
The query works fine when storing the result in my own AWS account, but apparently I can't just write "s3://[BUCKET_NAME]", where BUCKET_NAME is the name of the bucket in the other account.
I'm guessing there is something very simple I'm missing. If anyone could tell me how to format OUTPUT_LOCATION, where ACCOUNT_ID is the ID of the other account and BUCKET_NAME is the name of the bucket, that would be very helpful!
Upvotes: 2
Views: 2438
Reputation: 19
Bucket names must be unique within a partition. A partition is a grouping of Regions. AWS currently has three partitions: aws (Standard Regions), aws-cn (China Regions), and aws-us-gov (AWS GovCloud [US] Regions).
Refer to the S3 bucket naming documentation for the naming standards.
When specifying an S3 bucket you do not have to include an account ID. For example, the ARN for an S3 bucket has no account ID:
arn:aws:s3:::<BUCKET_NAME>
whereas the ARN for an IAM resource such as a role does include its account ID:
arn:aws:iam::<MyAccountA>:role/<MyRoleA>
client = boto3.client('athena')
client.start_query_execution(
    QueryString=[QUERY],
    QueryExecutionContext={
        'Database': [DATABASE]
    },
    ResultConfiguration={
        'OutputLocation': 's3://<BUCKET_NAME>'  # even when the bucket is in another AWS account
    }
)
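To confirm whether the write to the other account's bucket actually succeeds, you can poll the query's execution status; on failure, the StateChangeReason field usually spells out the S3 access error. A minimal sketch, assuming you capture the return value of the start_query_execution call (the query, database, and bucket name here are placeholders):

import time
import boto3

client = boto3.client('athena')

# capture the response so we can track the query we just started
response = client.start_query_execution(
    QueryString='SELECT 1',  # placeholder query
    QueryExecutionContext={'Database': '<DATABASE>'},
    ResultConfiguration={'OutputLocation': 's3://<BUCKET_NAME>/athena-results/'}
)
execution_id = response['QueryExecutionId']

# poll until the query reaches a terminal state
while True:
    status = client.get_query_execution(QueryExecutionId=execution_id)['QueryExecution']['Status']
    if status['State'] in ('SUCCEEDED', 'FAILED', 'CANCELLED'):
        break
    time.sleep(1)

# on FAILED, StateChangeReason usually names the S3 access problem
print(status['State'], status.get('StateChangeReason', ''))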
If the query fails, start looking at permissions. Your code is right.
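In the cross-account case, "permissions" usually means a bucket policy on the destination bucket that lets the querying account's role write to it. Below is a minimal sketch of the kind of policy the bucket-owning account might attach; the role ARN, bucket name, and exact set of actions Athena needs are assumptions to adjust for your setup:

import json
import boto3

# run with credentials from the account that owns the destination bucket
s3 = boto3.client('s3')

bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowAthenaResultsFromOtherAccount",
        "Effect": "Allow",
        # hypothetical ARN of the Lambda's execution role in the querying account
        "Principal": {"AWS": "arn:aws:iam::<ACCOUNT_ID>:role/<LAMBDA_ROLE>"},
        "Action": [
            "s3:GetBucketLocation",
            "s3:ListBucket",
            "s3:GetObject",
            "s3:PutObject"
        ],
        "Resource": [
            "arn:aws:s3:::<BUCKET_NAME>",
            "arn:aws:s3:::<BUCKET_NAME>/*"
        ]
    }]
}

s3.put_bucket_policy(Bucket='<BUCKET_NAME>', Policy=json.dumps(bucket_policy))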
Upvotes: 1