shane_00

Reputation: 101

Node.js - requesting a big amount of data results in ETIMEDOUT

I make a request to an API endpoint that returns a lot of data and takes some time to respond. The request sometimes succeeds, but sometimes I get an ETIMEDOUT error. I tried increasing the timeout of the request, but that doesn't solve my problem. Is there a way to load the data in chunks, or to increase the server timeout?

Upvotes: 0

Views: 866

Answers (3)

Andrey

Reputation: 70

As others have already mentioned, use streams.

Suppose a client requests a big file from our server (file.txt in the code below). Using streams, we send the file in chunks instead of buffering it all in memory: the server doesn't consume a lot of RAM and the client gets an immediate response, so everyone is happy.

const fs = require("fs");
const http = require("http");
const server = http.createServer();
const port = 8000;

server.on("request", (req, res) => {
  // Stream the file in chunks instead of buffering it all in memory
  const src = fs.createReadStream("./file.txt"); // file.txt is some big file
  src.pipe(res); // pipe ends the response when the stream finishes
});

server.listen(port, () => {
  console.log(`Server is listening on http://localhost:${port}`);
});

Upvotes: 0

qphi

Reputation: 121

Using the http module with http.request(), you can set a timeout as explained here: How to set a timeout on a http.request() in Node?

Note that you can also load the data in chunks by handling the res.on('data', ...) event in your callback, like:

const http = require('http');

// options is a placeholder for your endpoint, e.g. { hostname, path }
const req = http.request(options, (res) => {
  res.on('data', (chunk) => {
    console.log(`BODY: ${chunk}`);
  });
  res.on('end', () => {
    console.log('No more data in response.');
  });
});
req.end();

Code and more details at: https://nodejs.org/api/http.html#http_http_request_url_options_callback

Upvotes: 1

Rupjyoti

Reputation: 399

Yes, there is a way to send the response in chunks: use Node.js streams.

Below is an example.

const http = require('http');
const fs = require('fs');

http.createServer(function (req, res) {
  // The filename is simply the local directory plus the requested URL
  var filename = __dirname + req.url;

  // Open the file as a readable stream
  var readStream = fs.createReadStream(filename);

  // Wait until we know the readable stream is actually valid before piping
  readStream.on('open', function () {
    // Pipe the read stream to the response object (which goes to the client)
    readStream.pipe(res);
  });

  // Catch any errors that happen while creating the readable stream
  // (usually an invalid filename)
  readStream.on('error', function (err) {
    res.end(err.message); // res.end expects a string or Buffer, not an Error
  });
}).listen(8080);

The above code is from Node.js documentation.

Focus on the part "readStream.pipe(res);"

The response is sent continuously while the file is being read. Even if the file is large, it can still be sent slowly and continuously to the client.

Check the documentation:

https://nodejs.org/en/knowledge/advanced/streams/how-to-use-fs-create-read-stream/

Similarly, you can also let a client stream a large video file, say 750 MB, with this streaming approach. There are just a few more complications when handling video.

Upvotes: 0
