vamsi reddy

Reputation: 143

Using buffers and streams to write the data

I need a Node.js program for the following: I have a huge JSON file with thousands of records that I need to copy to another file using streams. Each record has three key-value pairs (name, age, and city), but I only need to copy one of them: the output file should contain only the names from the JSON records.

Can you suggest any alternatives? Note that the file is very large and contains thousands of records.

Sample data:

[  
   {  
      "name":"John",
      "age":31,
      "city":"New York"
   },
   {  
      "name":"vamsi",
      "age":31,
      "city":"New York"
   },
   {  
      "name":"loga",
      "age":31,
      "city":"New York"
   },
   {  
      "name":"krishna",
      "age":31,
      "city":"New York"
   },
   {  
      "name":"kishore",
      "age":31,
      "city":"New York"
   },
   {  
      "name":"reddy",
      "age":31,
      "city":"New York"
   }
]

Upvotes: 3

Views: 881

Answers (1)

Michał Karpacki

Reputation: 2658

Should be easy using streams:

First, install some modules for JSON streaming and transformation:

npm install --save JSONStream scramjet

Then write the code:

const scramjet = require("scramjet");
const fs = require("fs");
const JSONStream = require("JSONStream");

const filename = "input.json";            // path to the big JSON file (adjust as needed)
const outname = "output.json";            // path for the extracted names

fs.createReadStream(filename)             // open the file
    .pipe(JSONStream.parse('*'))          // parse the JSON array into an object stream
    .pipe(new scramjet.DataStream())      // pipe into scramjet for transformation
    .map(({name}) => name)                // extract the "name" field from each object
    .toJSONArray()                        // reassemble into a JSON array stream
    .pipe(fs.createWriteStream(outname)); // write to the output file
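Run against the sample data above, this should write something like `["John","vamsi","loga","krishna","kishore","reddy"]` to the output file (the exact whitespace depends on `toJSONArray`'s defaults).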

See the scramjet docs for more info.
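If you'd rather not add the scramjet dependency, a minimal sketch using only JSONStream and Node's built-in stream.Transform could look like this (the input.json and output.json paths are assumptions, adjust as needed):

const fs = require("fs");
const { Transform } = require("stream");
const JSONStream = require("JSONStream");

fs.createReadStream("input.json")
    .pipe(JSONStream.parse('*'))          // emit one object per array element
    .pipe(new Transform({
        objectMode: true,
        transform({ name }, _enc, done) {
            done(null, name);             // pass only the name downstream
        }
    }))
    .pipe(JSONStream.stringify())         // reassemble the values into a JSON array
    .pipe(fs.createWriteStream("output.json"));

Either way, only one record is held in memory at a time, so this should handle large files fine.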

Upvotes: 1
