Nrgyzer

Reputation: 1075

Very simple Node.js application uses multiple GB of memory

I have a very simple Node.js application:

const fs = require('fs')
const axios = require('axios')

async function test() {
  const {data} = await axios.get('http://speedtest.ftp.otenet.gr/files/test100k.db')

  fs.mkdirSync('./data', {
    recursive: true
  })

  for (let i = 0; i < 300_000; i++) {
    fs.writeFileSync(`./data/${Date.now()}`, data)
    console.log(i)
  }
}

test()

After a minute or so the application reaches about 15 GB of memory (I had to cancel it because of insufficient memory). I wonder about the reason. I'm simply downloading a 100 KB file and saving it 300,000 times. Why does the memory grow so much, and is there any solution to prevent this issue (other than using streams)?

Upvotes: 0

Views: 616

Answers (1)

jfriend00

Reputation: 708016

Welcome to a garbage collected system.

You are running code non-stop in a massive loop, leaving no cycles for garbage collection, so the process uses as much memory as it can before interrupting your code to try to reclaim some with garbage collection. The garbage collector normally waits until the interpreter is not busy and THEN runs. But your code is in a giant loop where each iteration uses some memory, so you never give the GC any slack time in which it would like to run.

FYI, since this is obviously a made-up test program, to advise on how best to solve problems like this we would need to see the real problem and real code, so we can suggest the best solution for that actual code.

Upvotes: 1
