zoran119

Reputation: 11307

Compiling and running in different containers

I have a project which compiles to a binary file, and running that binary file exposes some REST APIs.

To compile the project I need Docker image A, which has the compiler and all the libraries required to produce the executable. To run the executable (i.e. host the service) I can get away with a much smaller image B (just a basic Linux distro, no need for the compiler).

How does one use Docker in such a situation?

Upvotes: 2

Views: 426

Answers (2)

Creek

Reputation: 212

mkdir local_dir
docker run -dv $PWD/local_dir:/mnt BUILD_CONTAINER

Compile your code and save the output to /mnt in the container. It will be written to local_dir on your host filesystem and will persist after the container is destroyed.

You should now write a Dockerfile with a step that copies in the new binary, then build it. But for example's sake...

docker run -dv $PWD/local_dir:/mnt PROD_CONTAINER 

Your binary, and everything else in local_dir, will be available in the container at /mnt/
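The Dockerfile step mentioned above might look roughly like this. This is only a sketch: the binary name `myservice`, the base image, and the port are assumptions, not anything from the question.

```dockerfile
# Runtime image "B": small base distro, no compiler needed
FROM debian:stable-slim

# Copy the binary that the build container wrote into local_dir on the host
# ("myservice" is a placeholder name)
COPY local_dir/myservice /usr/local/bin/myservice

# Port the REST service listens on -- an assumption, adjust for your app
EXPOSE 8080

CMD ["/usr/local/bin/myservice"]
```

Then something like `docker build -t myservice .` and `docker run -p 8080:8080 myservice` would build and host the service without the compiler image.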

Upvotes: 2

shizhz

Reputation: 12501

My approach for this scenario would be to prepare two base images:

  • The 1st, which includes the compiler and all the libraries needed to build your executable; call it base-image:build
  • The 2nd, used as the base for your final deliverable image; call it base-image:runtime

And then break your build process into two steps:

  • Step 1: build your executable inside base-image:build, then put the executable somewhere you can fetch it later, such as NFS or an artifact registry;
  • Step 2: write a Dockerfile that starts FROM base-image:runtime, fetches the artifact/executable produced in Step 1, docker build your delivery image, and then docker push it to your registry for release.
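Step 2's Dockerfile might look something like the sketch below. The artifact URL, image names, and binary name are placeholders, since the answer deliberately leaves the storage location open (NFS, a registry, etc.).

```dockerfile
# Start from the runtime base image prepared earlier
FROM base-image:runtime

# Fetch the executable produced in Step 1
# (the URL and binary name "myservice" are placeholders)
ADD http://artifacts.example.com/myservice /usr/local/bin/myservice
RUN chmod +x /usr/local/bin/myservice

CMD ["/usr/local/bin/myservice"]
```

The release step would then be along the lines of `docker build -t my-registry/myservice:1.0 .` followed by `docker push my-registry/myservice:1.0`.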

Hope this helps :-)

Upvotes: 1
