Reputation: 5083
Most or all examples I see online copy package.json
into the image and then run npm install
within the image. Is there a deal-breaker reason not to run npm install
on the build server and then just copy everything, including the node_modules/
folder?
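For reference, the conventional pattern being questioned looks something like this (a minimal sketch; the base image tag, lockfile, and entry point are assumptions, not from any specific project):

```dockerfile
# Conventional approach: copy only the manifests, then install inside the image.
FROM node:18-alpine            # base image is an assumption
WORKDIR /app

# Copying package.json (and the lockfile) first lets Docker cache the
# install layer: it is rebuilt only when the manifests change.
COPY package.json yarn.lock ./
RUN yarn install --frozen-lockfile

# Application sources are copied afterwards so source edits do not
# invalidate the dependency layer.
COPY . .
CMD ["node", "index.js"]       # entry point is an assumption
```

With a private registry, the `RUN yarn install` step is exactly where credentials would need to be available inside the build, which is the friction the question describes.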
My main motivation for doing this is that we are using a private npm registry with security, and running npm
from within the image would mean figuring out how to securely embed credentials. Also, we are using yarn, and if yarn runs on the build server we could leverage the yarn cache across projects. I suppose there are workarounds for these, but running yarn/npm on the build server, where everything is already set up, seems very convenient.
thanks
Upvotes: 4
Views: 4623
Reputation: 5089
Public Dockerfiles out there try to provide a generalized solution.
Having dependencies declared in package.json
makes it possible to share a single Dockerfile
that does not depend on anything that isn't publicly available.
At runtime, however, Docker does not care how the files got into the container. So it is up to you how you get all the needed files into your image.
P.S. Consider layering. If you copy files under node_modules/
, do it in a single step, so that only one layer is used.
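The approach described above could look something like this (a minimal sketch, assuming `yarn install` has already been run on the build server and that `.dockerignore` does not exclude node_modules/; the base image tag and entry point are assumptions):

```dockerfile
# Dependencies were installed on the build server, so there is no
# npm/yarn invocation inside the image and no registry credentials needed.
FROM node:18-alpine            # base image is an assumption
WORKDIR /app

# One COPY brings in package.json, node_modules/, and the sources
# together, so everything lands in a single layer.
COPY . .
CMD ["node", "index.js"]       # entry point is an assumption
```

One caveat worth noting: any native addons in node_modules/ are compiled for the build server's OS and architecture, so this only works when the build server matches the container's platform (e.g. both Linux on the same CPU architecture).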
Upvotes: 5