Reputation: 11307
I have a project which compiles to a binary file, and running that binary file exposes some REST APIs.
To compile the project I need Docker image A, which has the compiler and all the libraries required to produce the executable. To run the executable (i.e. host the service) I can get away with a much smaller image B (just a basic Linux distro, no need for the compiler).
How does one use Docker in such a situation?
Upvotes: 2
Views: 426
Reputation: 212
mkdir local_dir
docker run -dv $PWD/local_dir:/mnt BUILD_CONTAINER
Compile your code and save it to /mnt in the container. It'll be written to local_dir on your host filesystem and will persist after the container is destroyed.
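For instance, you could attach to the running build container and compile into the shared mount; the container lookup is real, but the compile command itself is just a hypothetical example:

CID=$(docker ps -q --filter ancestor=BUILD_CONTAINER)
docker exec -it $CID /bin/bash
# inside the container, write the build output to /mnt, e.g.:
# gcc -o /mnt/myapp main.c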
You should now write a Dockerfile with a step that copies in the new binary, then build the image (sketched below). But for example's sake...
docker run -dv $PWD/local_dir:/mnt PROD_CONTAINER
Your binary, and everything else in local_dir, will reside in the container at /mnt/.
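The Dockerfile route mentioned above would bake the binary into the image instead of mounting it at runtime. A minimal sketch, where the base image and the binary name myapp are hypothetical placeholders:

# Dockerfile: copy the compiled binary into a small runtime image
FROM debian:stable
COPY local_dir/myapp /usr/local/bin/myapp
CMD ["/usr/local/bin/myapp"]

Build and run it with:

docker build -t my_prod_image .
docker run -d my_prod_image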
Upvotes: 2
Reputation: 12501
My thinking for this scenario is that you can prepare two base images:
base-image:build
base-image:runtime
And then break your build process into two steps:
Step 1: build with a Dockerfile which is FROM base-image:build, and then put your executable to some place, like NFS or any registry from where you can fetch it for later use.
Step 2: write a Dockerfile which is FROM base-image:runtime, fetch your artifact/executable from wherever Step 1 put it, docker build your delivery image, and then docker push it to your registry for release (both steps are sketched below).
Hope this could be helpful :-)
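A minimal sketch of the two Dockerfiles, assuming the artifact is published over plain HTTP; every image name, URL, and path here is hypothetical:

# Step 1 Dockerfile: compile and publish the executable
FROM base-image:build
COPY . /src
# upload the build output to an artifact server that accepts PUT
RUN make -C /src && curl -T /src/myapp http://artifacts.example.com/myapp

# Step 2 Dockerfile: build the delivery image on the runtime base
FROM base-image:runtime
ADD http://artifacts.example.com/myapp /usr/local/bin/myapp
RUN chmod +x /usr/local/bin/myapp
CMD ["/usr/local/bin/myapp"]

Then build and push the delivery image for release:

docker build -t registry.example.com/myapp:1.0 .
docker push registry.example.com/myapp:1.0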
Upvotes: 1