Yafim Dziuko

Reputation: 83

Writing large file in Nodejs using fs.createWriteStream

I have a function which, in a perfect world, should create a huge file of 1M lines. Here it is:

    const fileWriteStream = fs.createWriteStream(path.resolve(filePath));
    let ableToWrite = true;
    for (let i = 0; i < 1e6; i++) {
        if (ableToWrite) {
            ableToWrite = fileWriteStream.write(`.testClass${itr}-${i%2} { background: red } \n`);
        } else {
            fileWriteStream.once('drain', () => {
                ableToWrite = fileWriteStream.write(`.testClass${itr}-${i%2} { background: red } \n`);
            })
        }
    }

Unfortunately for me, I'm getting the following warning pretty quickly:

MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 drain listeners added. Use emitter.setMaxListeners() to increase limit

I don't really want to increase listeners count for this function. What is correct way to write such a big file using streams?

Many thanks!

Upvotes: 0

Views: 3574

Answers (1)

Bergi

Reputation: 664195

The easiest way to asynchronously continue when the stream drains is to `await` a promise inside the loop (which requires the surrounding function to be `async`):

const fileWriteStream = fs.createWriteStream(path.resolve(filePath));
for (let i = 0; i < 1e6; i++) {
    const ableToWrite = fileWriteStream.write(`.testClass${itr}-${i%2} { background: red } \n`);
    if (!ableToWrite) {
        await new Promise(resolve => {
            fileWriteStream.once('drain', resolve);
        });
    }
}

The alternative is using a recursive approach instead of a loop:

function go(i) {
    if (i >= 1e6) return;
    const ableToWrite = fileWriteStream.write(`.testClass${itr}-${i%2} { background: red } \n`);
    if (ableToWrite)
        go(i+1);
    else
        fileWriteStream.once('drain', () => {
            go(i+1);
        });
}
const fileWriteStream = fs.createWriteStream(path.resolve(filePath));
go(0);

Upvotes: 5
