Reputation: 139
I have a folder that contains code, neural-network weights, and Annoy indices, and it takes approximately 16 GB. The folder also contains a Dockerfile and a requirements.txt. When I call docker build ., the step "Sending build context to Docker daemon" takes 16 GB of space, and ADD . /model takes another 32 GB; eventually 16 GB of that is freed. Hence, I need at least 48 GB of free storage for the build to succeed. Here is my bare-minimum Dockerfile:
FROM python:3.8.5
WORKDIR /model
COPY requirements.txt .
RUN pip install -r requirements.txt
ADD . /model
CMD ["python", "deploy.py"]
How can I fix this? It appears that the data is being copied from the current directory multiple times.
Upvotes: 0
Views: 811
Reputation: 6372
By default, the Docker client sends the entire directory containing the Dockerfile to the Docker daemon as the build context.
You can either eliminate what is not necessary from the context with a .dockerignore file, or bring the data into the container via volumes rather than copying it at build time.
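For the first option, a minimal .dockerignore placed next to the Dockerfile might look like this (the directory names and extension are illustrative; list whatever should not be sent to the daemon):

```
# .dockerignore — everything matched here is excluded from the build context
weights/
indices/
*.ann
```

Note that paths excluded this way will also not be available to ADD or COPY, so excluded data has to reach the container some other way, such as a volume.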
The volume-mount approach would look something like this:
docker run -v <path-to-your-model-on-the-host-machine>:<path-where-you-want-the-data-inside-the-container> ...
Beware that this allows the container to alter the data on your host machine; append :ro to the volume specification for a read-only mount.
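Filled in with illustrative values, and assuming deploy.py is adjusted to load its weights from /model/data inside the container, the full command might be:

```
docker run -v /home/me/model-data:/model/data:ro my-model-image
```

With the data excluded from the build context and mounted at run time, the build needs only the space for the code itself rather than 48 GB.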
Upvotes: 1