Reputation: 43
I'm trying to use AWS Secrets Manager to set usernames and passwords for a Postgres database, and to let other containers connect to it. Currently the secrets are set correctly and the Postgres StatefulSet comes up fine, but I'm not sure how to pass those values into a config file for another container.
I have mounted the secrets as environment variables in the container, and I can see that they get set successfully. Then I tried using:
dsn: postgres://$POSTGRES_USER:$POSTGRES_PASSWORD@postgres-headless-svc.{{ target.name }}.svc.cluster.local:5432/$POSTGRES_DB
I also tried wrapping the env variables like this:
${var}
But when the init job runs, it fails because it cannot connect to the Postgres database.
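For reference, the env variables are mounted from a Kubernetes Secret roughly like this (the Secret name and key names here are illustrative):

env:
  - name: POSTGRES_USER
    valueFrom:
      secretKeyRef:
        name: postgres-credentials
        key: username
  - name: POSTGRES_PASSWORD
    valueFrom:
      secretKeyRef:
        name: postgres-credentials
        key: password
  - name: POSTGRES_DB
    valueFrom:
      secretKeyRef:
        name: postgres-credentials
        key: database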
Upvotes: 1
Views: 70
Reputation: 11
This can be done using the AWS SDKs, the AWS CLI, or environment variables.
Using an AWS SDK (for example in Python or Node.js): install the AWS SDK in your container's environment, then use the SDK to retrieve the secret when your application starts up.
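For instance, a minimal Python sketch using boto3, assuming the secret is stored as a JSON string (the secret id, key names, and namespace are illustrative):

import json
import boto3

def get_db_credentials(secret_id="myapp/postgres"):  # illustrative secret id
    client = boto3.client("secretsmanager")
    # SecretString holds the secret payload as a JSON string
    response = client.get_secret_value(SecretId=secret_id)
    return json.loads(response["SecretString"])

creds = get_db_credentials()
dsn = (
    f"postgres://{creds['username']}:{creds['password']}"
    f"@postgres-headless-svc.mynamespace.svc.cluster.local:5432/{creds['dbname']}"
)

The namespace in the host name is a placeholder for the {{ target.name }} template value from the question.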
Using the AWS CLI in an entrypoint script: add a script to the entrypoint of your container that retrieves secrets via the AWS CLI, and make sure jq is installed in the container to parse the JSON output.
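A sketch of such an entrypoint script, assuming the container already has AWS credentials available and the secret is a JSON string (the secret id and key names are illustrative):

#!/bin/sh
# Fetch the raw JSON payload of the secret
SECRET_JSON=$(aws secretsmanager get-secret-value \
  --secret-id myapp/postgres \
  --query SecretString --output text)

# Parse the individual fields with jq and export them for the app
export POSTGRES_USER=$(echo "$SECRET_JSON" | jq -r .username)
export POSTGRES_PASSWORD=$(echo "$SECRET_JSON" | jq -r .password)
export POSTGRES_DB=$(echo "$SECRET_JSON" | jq -r .dbname)

# Hand off to the container's main command
exec "$@"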
Using environment variables in Kubernetes: if you are deploying on Kubernetes, you can mount secrets as environment variables or as volumes.
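The question already shows the env variable route; as a volume-based sketch (the Secret name and mount path are illustrative), each key in the Secret becomes a file under the mount path:

volumes:
  - name: db-credentials
    secret:
      secretName: postgres-credentials
containers:
  - name: app
    volumeMounts:
      - name: db-credentials
        mountPath: /etc/db-credentials
        readOnly: true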
Once your application container has access to the secrets, modify your application's configuration file or startup script to read these values from the environment variables or from the files where the secrets are mounted.
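For a templated config file like the dsn in the question, note that plain $VAR or ${VAR} references in a config file are not expanded unless something substitutes them before the application reads the file. One option is to render the template at container startup with envsubst (from GNU gettext). A sketch, assuming the template lives at /config/app.yaml.tmpl and the app binary is myapp (both illustrative):

#!/bin/sh
# Substitute ${POSTGRES_USER}, ${POSTGRES_PASSWORD}, ${POSTGRES_DB}
# into the config template before starting the application
envsubst < /config/app.yaml.tmpl > /config/app.yaml
exec myapp --config /config/app.yaml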
Upvotes: 0