Reputation: 4411
There are a lot of people online asking this same question in different ways, but there is no clear answer. Can anybody explain why a docker build fails when a package-lock.json file exists in the application, but runs successfully when it does not? It seems to be related to npm, but it is not clear how. Everybody says to delete package-lock.json, but it is there for a reason.
Note: npm install works fine on my local machine; it only fails in the Docker container.
If I have this Dockerfile:
# First Stage: Builder
FROM node:13.12.0-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
And run this:
docker build -t container-tag ./
I get this:
npm WARN tar ENOENT: no such file or directory, open '/app/node_modules/.staging/eventsource-c2615740/example/index.html'
npm WARN tar ENOENT: no such file or directory, open '/app/node_modules/.staging/eventsource-c2615740/example/sse-client.js'
npm WARN tar ENOENT: no such file or directory, open '/app/node_modules/.staging/react-router-a14663ae/README.md'
But this Dockerfile will run successfully:
# First Stage: Builder
FROM node:13.12.0-alpine AS build
WORKDIR /app
# Note that there is no star here: only package.json is copied, not package-lock.json
COPY package.json ./
RUN npm install
COPY . .
RUN npm run build
Upvotes: 17
Views: 23456
Reputation: 1557
From your question:
Note: npm install works fine on my local machine, just fails in docker container
If you are using npm install, you are not guaranteed to get the same versions of your dependencies in every environment. For a reproducible environment, without unexpected issues caused by differing dependency versions, you'd rather use npm ci (clean install):
This command is similar to npm-install, except it’s meant to be used in automated environments such as test platforms, continuous integration, and deployment – or any situation where you want to make sure you’re doing a clean install of your dependencies. It can be significantly faster than a regular npm install by skipping certain user-oriented features. It is also more strict than a regular install, which can help catch errors or inconsistencies caused by the incrementally-installed local environments of most npm users.
In short, the main differences between using npm install and npm ci are:
- The project must have an existing package-lock.json or npm-shrinkwrap.json.
- If dependencies in the package lock do not match those in package.json, npm ci will exit with an error, instead of updating the package lock.
- npm ci can only install entire projects at a time: individual dependencies cannot be added with this command.
- If a node_modules is already present, it will be automatically removed before npm ci begins its install.
- It will never write to package.json or any of the package-locks: installs are essentially frozen.
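Applied to the Dockerfile from the question, a minimal sketch of the npm ci approach might look like this (the node:13.12.0-alpine tag and build stage are kept from the question; any recent Node image would work the same way):

```dockerfile
# First Stage: Builder
FROM node:13.12.0-alpine AS build
WORKDIR /app

# Copy both manifests; npm ci requires package-lock.json to be present
COPY package*.json ./

# Clean install: reads only the lockfile, removes any existing node_modules,
# and exits with an error if package-lock.json and package.json disagree
RUN npm ci

COPY . .
RUN npm run build
```

Because npm ci never writes to package.json or package-lock.json, the install step is deterministic and the Docker layer cache for it is only invalidated when one of the two manifest files actually changes.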
Fabian Gander's article gives further clarification about the npm install and npm ci tools and provides advice on when to use each one. The table below is from that source:
cases | npm install | npm ci
--------------------------------------|-------------|-------------
needs package.json | no | yes
needs package-lock.json | no | yes
installs from package.json | yes | no
installs from package-lock.json | no | yes
compares both | no | yes
updates loose package versions | yes | no
updates loose dependencies | yes | no
writes to package.json | yes | no
writes to package-lock.json | yes | no
deletes node_modules before install | no | yes
used for installing separate package | yes | no
should be used on build systems / CI | no | yes
can be used for development | yes | yes
reproducible installs | no | yes
This is why package-lock.json is there: to be available for tools like npm ci.
If a reproducible environment doesn't fix your issue, you will need to keep investigating, but IMO it should be the first step.
Upvotes: 21
Reputation: 176
Some reasons your local build succeeded but the Docker build failed could have been (in order of likelihood):
- You overwrote the container's node_modules folder with the node_modules folder from your host, because you didn't add node_modules to .dockerignore and issued a COPY / ADD command while node_modules existed in . on the host.
However, I can't explain why omitting the package-lock.json from the COPY would then make the build work. So the problem could also have involved one of the following:
- When generating package-lock.json, you npm install-ed locally under a different version of node than the one specified in your Dockerfile.
- When generating package-lock.json, you built locally under a different operating system than Alpine Linux.
- When generating package-lock.json, you npm install-ed locally under a different version of npm than the one in the Docker container, which may have treated the lockfile relationship differently.
All of these actions can produce a package-lock.json that causes an npm install (and, more likely, an npm ci) in the container to fail. I'm not sure why they would cause the specific errors you posted, though.
If one of these is the issue, the proper solution is surely to do all your npm manipulation (including generating and modifying package.json and package-lock.json) inside the same container specification that you intend to ship the code in, and to work out a way to commit the results to source control from there. This can be complicated by the need to build your node_modules in the same environment you push source-code changes from (for instance a build step whose results need pushing into the container, or a git hook that needs installing). I've not yet seen a perfect solution to this issue.
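For the host-node_modules cause described above, the usual fix is a .dockerignore file next to the Dockerfile, so that COPY . . never drags the host's node_modules into the image. A minimal sketch (the entries besides node_modules are common additions, not something from the question):

```
node_modules
npm-debug.log
.git
```

With node_modules excluded, the modules installed by RUN npm install (or npm ci) inside the image are never clobbered by platform-specific binaries built on the host.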
Upvotes: 3