icelemon

Reputation: 865

How to process an array-of-JSON stream response from a streaming HTTP request in a Node.js server

The stream response is of the form:

[{
  "id": 0,
  "name": "name0"
},
{
  "id": 1,
  "name": "name1"
}]

If I use node-fetch's stream feature and iterate over response.body, the chunks cut the objects at arbitrary points, so I can't call JSON.parse on them. I guess node-fetch doesn't support an array of JSON objects and can't recognize the [ and ] delimiters.
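
For example, the chunk boundaries can fall in the middle of an object, so no individual chunk is valid JSON on its own (an illustrative split; the actual boundaries vary):

chunk 1: [{"id":0,"na
chunk 2: me":"name0"},{"id":1,
chunk 3: "name":"name1"}]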

How can I process a streaming array of JSON? Or is there a third-party library for this? Sample code:

const fetch = require('node-fetch');

async function main() {
  const response = await fetch(url);
  try {
    for await (const chunk of response.body) {
      console.log('----start');
      console.dir(JSON.parse(chunk.toString())); // throws when a chunk ends mid-object
      console.log('----end');
    }
  } catch (err) {
    console.error(err.stack);
  }
}

main()

Upvotes: 0

Views: 1253

Answers (1)

jorgenkg

Reputation: 4275

An approach to stream-parse an external JSON source is to combine node-fetch with stream-json, which parses the incoming data regardless of how the (string) data is chunked.

import util from "util";
import stream from "stream";
import StreamArray from "stream-json/streamers/StreamArray.js";
import fetch from "node-fetch";

const response = await fetch(url);

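// StreamArray re-assembles the chunks and emits one {key, value} pair per array element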
await util.promisify(stream.pipeline)(
  response.body,
  StreamArray.withParser(),
  async function( parsedArrayEntriesIterable ){
    for await (const {key: arrIndex, value: arrElem} of parsedArrayEntriesIterable) {
      console.log("Parsed array element:", arrElem);
    }
  }
)
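
Both packages are available from npm: npm install node-fetch stream-json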

stream.pipeline() with an async function as the last stage requires Node.js >= v13.10.
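
On Node.js v15 and later, the same approach can be written with the promise-based pipeline from stream/promises instead of util.promisify. A minimal sketch, assuming the same url variable and an ESM context:

import { pipeline } from "stream/promises";
import StreamArray from "stream-json/streamers/StreamArray.js";
import fetch from "node-fetch";

const response = await fetch(url);

await pipeline(
  response.body,
  StreamArray.withParser(),
  async function (parsedArrayEntriesIterable) {
    // One {key, value} pair is emitted per element of the top-level array
    for await (const { value } of parsedArrayEntriesIterable) {
      console.log("Parsed array element:", value);
    }
  }
);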

Upvotes: 1
