Reputation: 39
I have a Python application and I am creating a Docker image for it (sample Dockerfile below). I want to run the container on an AWS EC2 instance or as an AWS Batch job. Can I pass arguments when running the container so that they are forwarded to my Python app and the app can use the values at runtime? Is it possible to pass arguments when starting a Batch job that can be mapped to environment variables defined in my app?
FROM python:3.7
ADD test.py /
RUN pip install ...
CMD [ "python3", "./test.py" ]
Upvotes: 0
Views: 570
Reputation: 59946
Is it possible to pass arguments when starting a Batch job that can be mapped to environment variables defined in my app?
Since you can pass environment variables directly to an AWS Batch job, there is no need to map arguments to application environment variables.
You can set the environment variables in the Job Definition and consume them inside the Python app.
The configuration below is equivalent to docker run -it -e BATCH_FILE_S3_URL="s3://my-batch-scripts/myjob.sh" -e BATCH_FILE_TYPE="script"
Set environment variable in Job Definitions
{
  "containerProperties": {
    ...
    "environment": [
      {
        "name": "BATCH_FILE_S3_URL",
        "value": "s3://my-batch-scripts/myjob.sh"
      },
      {
        "name": "BATCH_FILE_TYPE",
        "value": "script"
      }
    ]
  }
}
In the Python application:
import os

S3_URL = os.getenv('BATCH_FILE_S3_URL')
# S3_URL will be `s3://my-batch-scripts/myjob.sh`
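For example, a minimal test.py could read both variables from the job definition above (the variable names match the snippet; the load_config helper and the fallback default are illustrative, not part of any AWS API):

```python
import os

def load_config():
    # Read the variables injected by AWS Batch (or by docker run -e);
    # os.getenv returns None when a variable is not set, and the second
    # argument supplies an optional default.
    return {
        "s3_url": os.getenv("BATCH_FILE_S3_URL"),
        "file_type": os.getenv("BATCH_FILE_TYPE", "script"),
    }

if __name__ == "__main__":
    cfg = load_config()
    print(cfg)
```

To test locally before creating the job definition, the same variables can be supplied on the command line, e.g. docker run -e BATCH_FILE_S3_URL="s3://my-batch-scripts/myjob.sh" -e BATCH_FILE_TYPE="script" my-image.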
Upvotes: 1