ram12393

Reputation: 1258

JavaScript heap out of memory in Docker image run

Usually I run my application with npm run dev, and the package.json file has a script like below:

"scripts": {
        "dev": "nodemon server.ts",
    }

Here everything works fine.

I build a Docker image based on the Dockerfile below:

FROM node:14.17-alpine

# install npm and nodemon globally
RUN npm i -g [email protected]
RUN npm i -g nodemon

# build tools needed for native addons
RUN apk add g++ make python

WORKDIR /app

# install dependencies first to take advantage of the Docker layer cache
ADD package*.json ./
RUN npm install

ADD . .

CMD npm run dev

I can successfully build the image using

 docker build --tag test-backend .

and when I run this image

docker run  -it -p 3003:3003 test-backend

I'm facing a JavaScript heap out of memory error:

<--- Last few GCs --->

[31:0x55effca3d8e0]    57856 ms: Mark-sweep (reduce) 989.3 (996.6) -> 988.2 (997.9) MB, 1275.1 / 0.0 ms  (average mu = 0.173, current mu = 0.035) allocation failure scavenge might not succeed
[31:0x55effca3d8e0]    59100 ms: Mark-sweep (reduce) 989.3 (999.9) -> 988.5 (999.4) MB, 1206.7 / 0.0 ms  (average mu = 0.107, current mu = 0.030) allocation failure scavenge might not succeed


<--- JS stacktrace --->

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
Aborted
[nodemon] app crashed - waiting for file changes before starting...

What is the reason behind this?

Upvotes: 23

Views: 49068

Answers (3)

Mahesh Ahire

Reputation: 71

---- For non-Linux users (Windows specifically) ----

Add

ENV GENERATE_SOURCEMAP=false

in your Dockerfile. This worked for me.
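A minimal sketch of where it could go, using the Dockerfile from the question (note that GENERATE_SOURCEMAP is read by build tools such as react-scripts, so whether it has any effect depends on your project):

FROM node:14.17-alpine

# disable source map generation during the build step
ENV GENERATE_SOURCEMAP=false

WORKDIR /app
ADD package*.json ./
RUN npm install
ADD . .
CMD npm run dev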

Thanks!

Upvotes: 0

Kerem atam

Reputation: 2777

For Mac users out there! After adding the directives below for the environment variables, I still needed to increase swap and memory from the Docker Desktop GUI:

ENV GENERATE_SOURCEMAP=false
ENV NODE_OPTIONS=--max-old-space-size=16384

(screenshot of the Docker Desktop memory and swap settings)
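To confirm the NODE_OPTIONS value is actually picked up inside the container, one option is to print the heap limit Node sees (a quick sketch, assuming the image is tagged test-backend as in the question):

# prints V8's heap size limit in MB from inside the container
docker run --rm test-backend \
  node -e "console.log(require('v8').getHeapStatistics().heap_size_limit / 1024 / 1024)"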

Upvotes: 12

dimaslz

Reputation: 515

I was trying to fix the same problem. Basically, the problem occurs because the process needs more memory than the system allows. For example, if your machine has 2GB of memory, the process exceeds the memory available.

I solved this kind of problem by adding a swap file to the machine to help the memory a bit (this article may help you: https://tecadmin.net/linux-create-swap/), and by setting the env ENV NODE_OPTIONS=--max_old_space_size=2048 in your Dockerfile to limit the memory the Node process will use.

Instead of 2048, use the same value as the memory you have on your machine.

This setting works for me. I hope these comments can help you.
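A minimal sketch of the two places this can be set, assuming the 2GB figure from the example above (NODE_OPTIONS can also be passed per container instead of being baked into the image):

# Dockerfile: cap V8's old space at roughly the machine's available memory
ENV NODE_OPTIONS=--max_old_space_size=2048

# or at run time, without rebuilding the image
docker run -it -p 3003:3003 -e NODE_OPTIONS=--max_old_space_size=2048 test-backend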

Upvotes: 46
