Stewart

Reputation: 1919

What is the most memory-friendly way to save a large file in Node JS?

I have a Node script that uses the fs.writeFileSync() method to write ~50MB of JSON to a file on disk.

It works fine on my powerful laptop, but when I run it in a cloud VM with 1GB RAM, it crashes with exit code 137, which indicates the process was killed (typically by the kernel's OOM killer when memory runs out).

Is there a more memory-friendly way to save a file to disk?

This is the error I get:

20 error code ELIFECYCLE
21 error errno 137

Upvotes: 1

Views: 1017

Answers (2)

Stewart

Reputation: 1919

I was asking the wrong question. The right question was "how do I avoid my script crashing without upgrading my RAM?".

The answer is to add a swap file on my Linux server to work around the limited memory. I wasn't using one, so my RAM was quickly exhausted; adding a 2GB swap file solved the problem. No new code needed.

https://www.answertopia.com/ubuntu/adding-and-managing-ubuntu-swap-space/
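For reference, a typical way to set this up on Ubuntu looks like the following (run with root privileges; the path /swapfile is conventional, not required, and the commands below are a sketch of the steps in the linked guide):

```shell
# Create a 2GB file to back the swap space
sudo fallocate -l 2G /swapfile

# Restrict access: swap can contain sensitive memory contents
sudo chmod 600 /swapfile

# Format the file as swap and enable it immediately
sudo mkswap /swapfile
sudo swapon /swapfile

# Verify the new swap is active
swapon --show

# Optionally make it persistent across reboots
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
```

Note that swap is much slower than RAM, so this prevents the crash rather than making the script fast; a streaming approach (as in the other answer) avoids the memory spike in the first place.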

Upvotes: 0

eol

Reputation: 24565

There's a nice library called JSONStream which lets you stream your data set to a destination such as a file, instead of serializing the whole thing at once with JSON.stringify, so the full JSON string is never held in memory. Something like this should work:

const JSONStream = require('JSONStream');
const fs = require('fs');

const records = [
    {id: 1, name: "SomeName123"},
    {id: 2, name: "SomeName123"},
    {id: 3, name: "SomeName123"},
    {id: 4, name: "SomeName123"},
    {id: 5, name: "SomeName123"}
    // a lot more records
];


const transformStream = JSONStream.stringify();
const outputStream = fs.createWriteStream(__dirname + "/result.json");

transformStream.pipe(outputStream);

// Write records one at a time; an arrow function avoids passing
// write() unbound and receiving forEach's extra index/array arguments
records.forEach(record => transformStream.write(record));

transformStream.end();

outputStream.on(
    "finish",
    () => {
        console.log("Done");
    }
);

Upvotes: 4

Related Questions