RRP

Reputation: 2853

How to deal with response from a large file in Node.js?

I am dealing with a large file that I am fetching with an HTTP request. On the remote server, the response is piped back via a stream:

fs.createReadStream(pathToFile).pipe(res)

My request function looks like this:

var http = require('http');
var fs = require('fs');

function makeRequest(requestData, IP) {
    var options = {
        host: IP,
        port: 5000,
        path: '/read',
        method: 'POST',
        headers: {
            'Content-Type': 'application/json'
        }
    };

    var req = http.request(options, function(res) {
        //res.setEncoding('binary');
        var data = [];
        res.on('data', function(chunk) {
            data.push(chunk);
        });

        res.on('end', function() {
            //fs.appendFileSync(fileName, data);
            var binary = Buffer.concat(data);
            var writeStream = fs.createWriteStream(fileName, { "flags": 'a' });
            writeStream.write(binary);
            writeStream.end();
        });

        res.on('error', function(err){
            console.log("Error during HTTP request");
            console.log(err.message);
        });
    });

    req.write(requestData);
    req.end();
}

I believe that because the quantity of data is so large, it exhausts memory and the server simply crashes. How should I deal with this?

This question is based on another question I asked before (Reading large files in Node.js from another server).

Upvotes: 0

Views: 2970

Answers (1)

Jonas Wilms

Reputation: 138557

Your server does not crash because you are requesting or saving such large files, but because you do this:

 var data = [];

You cache the whole 1 GB (or however large your file is) IN RAM! And in a very memory-hungry way at that (an array of chunks, why not a single buffer?). Instead you should write the data directly to the file and not cache it:

var writeStream = fs.createWriteStream(fileName, { flags: 'a' });

res.on('data', function(chunk) {
    writeStream.write(chunk);
});

res.on('end', function() {
    writeStream.end();
});

That can actually be simplified to:

res.pipe(fs.createWriteStream(fileName, { flags: 'a' }));

That way only a few chunks stay in RAM at any time, and they are freed as soon as they have been written to disk.

Upvotes: 2
