lnx

Reputation: 378

Avoid reinstallation of packages on every Docker build

When I build an image for a new Python application that needs TensorFlow (import tensorflow), Docker downloads and installs the ~520 MB TensorFlow package on every build.

How can I avoid this? That is, download TensorFlow only once and reuse it while building many images?

Dockerfile

FROM python:3

WORKDIR /usr/src/app

COPY model.py .
COPY model_08015_07680.h5 .
COPY requirements.txt .
COPY images .
COPY labels.txt .
COPY test_run.py .

RUN pip install --no-cache-dir -r requirements.txt

CMD ["python","./test_run.py"]

requirements.txt

numpy
opencv-python
tensorflow

Upvotes: 3

Views: 2221

Answers (2)

paltaa

Reputation: 3244

You don't need to COPY each file separately; that is not optimal.

Also, remember that Docker images are built in layers, and a changed layer invalidates the cache for every layer after it. So lines that are likely to change should go toward the bottom of the Dockerfile, after the dependency installation:

FROM python:3

WORKDIR /usr/src/app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy everything else
COPY . .
CMD ["python","./test_run.py"]

Upvotes: 1
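One way to confirm the cache is being reused (the image tag `myapp` below is just a placeholder):

```shell
# First build: the pip install layer runs and downloads TensorFlow once.
docker build -t myapp .

# Change only application code (e.g. test_run.py), then rebuild.
# The requirements.txt and pip install layers are reused from cache --
# the classic builder prints "Using cache" for those steps, BuildKit
# prints "CACHED" -- so TensorFlow is not downloaded again.
docker build -t myapp .
```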

nischay goyal

Reputation: 3480

Please use the below Dockerfile, which is a bit optimised: it will not reinstall the dependencies on every build unless you change requirements.txt.

FROM python:3

WORKDIR /usr/src/app

#Copy Requirements
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

#Copy everything later, as below parts will be changing and above part will be used from cache
COPY model.py .
COPY model_08015_07680.h5 .
COPY images .
COPY labels.txt .
COPY test_run.py .


CMD ["python","./test_run.py"]

Upvotes: 0
