Reputation: 103
I have two Python scripts that I'm running in a container. The first script loads some data from disk, does some manipulation, and then saves the output inside the container. The second script does something similar, again saving its output inside the container. However, once these scripts finish running, my container is basically "done", and Kubernetes just re-deploys the same build, forever. I want to run these scripts once but be able to access their results whenever, without the container continuously being rebuilt.
Here's my Dockerfile, generally:
FROM X
...
RUN python3 script1.py
RUN python3 script2.py
Currently I'm trying CMD sleep infinity
to keep the container alive so I can access it through a shell later, but that isn't working. I've also tried ENTRYPOINT ["sh"]
, to no avail.
So generally, the Dockerfile I'm now using looks like this:
FROM X
...
RUN python3 script1.py
RUN python3 script2.py
CMD sleep infinity
Upvotes: 2
Views: 640
Reputation: 58523
In Kubernetes/OpenShift you would use a Job. To save the results, you will also need to claim a persistent volume and mount it into the Job's Pod, giving the scripts a place to write their output. Later on, you can create a temporary Pod that mounts the same persistent volume to access the results.
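A minimal sketch of what that could look like (the image name, PVC size, and mount path are all assumptions — the scripts would need to be adjusted to write their output under the mounted path):

```yaml
# Hypothetical names throughout; adapt image, storage size, and paths to your setup.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: results-pvc
spec:
  accessModes: ["ReadWriteOnce"]
  resources:
    requests:
      storage: 1Gi
---
apiVersion: batch/v1
kind: Job
metadata:
  name: run-scripts
spec:
  template:
    spec:
      restartPolicy: Never            # run to completion once; no endless re-deploys
      containers:
        - name: scripts
          image: my-image:latest      # hypothetical image built from your Dockerfile
          command: ["sh", "-c", "python3 script1.py && python3 script2.py"]
          volumeMounts:
            - name: results
              mountPath: /data        # scripts should save their output here
      volumes:
        - name: results
          persistentVolumeClaim:
            claimName: results-pvc
```

Note the scripts move from RUN instructions (build time) into the Job's command (run time). To inspect the results afterwards, you can apply a one-off Pod spec that mounts results-pvc and exec into it.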
Upvotes: 1
Reputation: 923
You could use docker-compose here. Mount a directory from your host, e.g. /var/opt/logs (or whichever directory you want), to the directory inside the container where your output is stored. Then the two directories and their files stay in sync regardless of whether the container is up or down. That's how I mount the scripts I need when a container is up and running.
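A sketch of such a bind mount in docker-compose (the service name and both paths are assumptions — the container path should be wherever your scripts save their output):

```yaml
# Hypothetical service name and paths; the host directory persists after the container exits.
version: "3"
services:
  app:
    build: .
    volumes:
      - ./results:/var/opt/logs   # host directory : container directory
```

Anything the scripts write to /var/opt/logs inside the container then lands in ./results on the host, where it remains accessible after the container stops.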
Upvotes: 0