Reputation: 68007
What is the recommended way to pass sensitive environment variables, e.g. passwords, to Amazon ECS tasks? With Docker Compose, I can use key-only environment variables, which results in the values being read from the OS environment. I can't see any corresponding method for ECS task definitions however.
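For reference, the Docker Compose behaviour described above (a key-only variable whose value is read from the host's environment) looks like this; the service and variable names are illustrative:

```yaml
# docker-compose.yml (illustrative names)
services:
  web:
    image: my-app          # hypothetical image
    environment:
      - DB_PASSWORD        # key only: the value is taken from the OS environment
```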
Upvotes: 15
Views: 12440
Reputation: 4012
In the task definition link you posted there is an "environment" section that allows you to do this. They become environment variables inside the container.
If you mean you would like to keep information outside of the task definition and that task could reference it, you cannot. Your best bet in that case is to have your container pull that information from an outside source and not have the ECS task config try to reference it.
Edit: I'm getting downvoted at this point because Parameter Store is now the right way to do it. At the time, this answer was the most correct option, but the other solutions using SSM are the right way now.
Upvotes: 5
Reputation: 1576
If AWS's baked-in secrets support is not an option, you could also use a Docker entrypoint to inject secrets at container runtime, as described in https://medium.com/@zdk/simple-and-secure-way-to-pass-secrets-and-credentials-into-docker-containers-c2f66175b0a4
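A minimal sketch of that entrypoint pattern: fetch the secret, export it, then hand off to the container's real command. The `fetch_secret` stub below stands in for a real lookup (e.g. `aws ssm get-parameter` or a Vault call); all names are illustrative:

```shell
#!/bin/sh
# entrypoint.sh -- sketch of secret injection at container start-up.

fetch_secret() {
  # In a real container this would call a secrets backend, e.g.:
  #   aws ssm get-parameter --name "$1" --with-decryption \
  #     --query Parameter.Value --output text
  echo "s3cr3t-value"
}

# Export the secret only inside this process tree, then replace this
# shell with the container's actual command via exec.
DB_PASSWORD="$(fetch_secret db_password)"
export DB_PASSWORD
exec "$@"
```

Because the script ends with `exec "$@"`, the application keeps PID 1 and the secret never appears in the image or the task definition.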
Upvotes: 0
Reputation: 4562
Approach 1:
You can use Parameter Store to store the variables. If you store them as SecureString, the values are encrypted.
You can reference the parameter names as environment variables in the task definition, and retrieve the decrypted values in the container's startup script:
value_from_parameter_store=$(aws ssm get-parameter --name "$parameter_store_key" --with-decryption --output text --query Parameter.Value --region "$REGION")
Passing parameter_store_key itself as an environment variable lets the script refer to it as $parameter_store_key.
Example
Dockerfile:
FROM ubuntu
# some other steps
CMD ["sh","/startup.sh"]
startup script:
#!/bin/bash
# Retrieve the decrypted password from Parameter Store at container start-up
db_password=$(aws ssm get-parameter --name "$parameter_store_key" --with-decryption --output text --query Parameter.Value --region "$REGION")
export db_password
# Do other stuff and use this password
Put parameter in SSM:
aws ssm put-parameter --name 'db_password' --type "SecureString" --value 'P@ssW%rd#1'
Docker run command:
docker run -e parameter_store_key=db_password -e REGION=us-east-1 <docker_image>
Approach 2:
Recently, AWS announced native secrets support in the ECS container definition; see Using Secrets in ECS.
Upvotes: 15
Reputation: 18824
Parameter Store is the way to go; it stores the variables encrypted using a KMS key.
Amazon has just announced support for specifying secrets directly in the task definition: you reference the SSM parameter's ARN and name the environment variable to set on the task.
{
  ...
  "secrets": [
    {
      "name": "environment_variable_name",
      "valueFrom": "arn:aws:ssm:region:aws_account_id:parameter/parameter_name"
    }
  ]
}
See the official docs here.
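Note that for ECS to resolve these secrets, the task execution role needs permission to read the parameters (and to use the KMS key if you encrypt with a customer-managed key). A minimal illustrative policy, with the region, account ID, and resource names as placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["ssm:GetParameters"],
      "Resource": "arn:aws:ssm:region:aws_account_id:parameter/parameter_name"
    },
    {
      "Effect": "Allow",
      "Action": ["kms:Decrypt"],
      "Resource": "arn:aws:kms:region:aws_account_id:key/key_id"
    }
  ]
}
```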
There's also a project called chamber that can load all parameters from a given path in SSM and set them as environment variables.
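For example, assuming chamber is installed and AWS credentials are configured (`myservice` is a placeholder path in SSM):

```
# Store a secret under the service's path in Parameter Store
chamber write myservice db_password 'P@ssW%rd#1'

# Launch a process with all of myservice's parameters exported as env vars
chamber exec myservice -- ./my-app
```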
Upvotes: 8
Reputation: 66
If you use plain environment variables, their values can be seen by anyone who can view the task definition in the AWS console. AWS has written a guide on using proper "secrets" to keep your sensitive data hidden: the containers load them on startup into memory-based environment variables. Here's the guide: https://aws.amazon.com/blogs/security/how-to-manage-secrets-for-amazon-ec2-container-service-based-applications-by-using-amazon-s3-and-docker/
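The core of that approach is an entrypoint that downloads an env file from S3 and exports its contents before starting the app. A sketch of the export step (the `aws s3 cp` line is shown as a comment since it needs real credentials; the bucket name and file contents are illustrative):

```shell
#!/bin/sh
# In a real entrypoint the env file would come from S3, e.g.:
#   aws s3 cp s3://my-secret-bucket/myapp.env /tmp/myapp.env
printf 'DB_PASSWORD=s3cr3t\nAPI_KEY=abc123\n' > /tmp/myapp.env

# `set -a` marks every variable assigned while sourcing for export,
# so they become environment variables of the app started afterwards.
set -a
. /tmp/myapp.env
set +a

rm /tmp/myapp.env   # don't leave the plaintext file on disk
```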
Upvotes: 4
Reputation: 1956
Here is an implementation of the S3 approach Marc Young mentioned, pulling environment variables from S3:
https://www.promptworks.com/blog/handling-environment-secrets-in-docker-on-the-aws-container-service
Upvotes: 3