Rocky

Reputation: 431

How to use read and write streams with csv-parse

Hello, I'm new here and confused about how to use a read stream and a write stream with csv-parse (I'm using this library: https://www.npmjs.com/package/csv-parse). Right now I have tried this:

    const fs = require('fs');
    const path = require('path');
    const parse = require('csv-parse'); // csv-parse v4 style; in v5+ it is: const { parse } = require('csv-parse');

    const options = {}; // parser options, e.g. { delimiter: ',' }
    const output = [];

    fs.createReadStream(path.join(__dirname, "../demofile/Customers.csv"))
        .pipe(parse(options))
        .on('data', function(csvrow) {
            console.log(csvrow);
            // do something with each parsed row
            output.push(csvrow);
        })
        .on('end', function() {
            // do something with the collected rows
            console.log(output);
        });

But I don't think this is the correct way to use the read and write streams of csv-parse. Can anyone suggest how I can use read and write streams with the above code? Basically, I want to read a CSV file and write it into another CSV file. The reason for using read and write streams is that my file is about 2 GB.

Upvotes: 1

Views: 10959

Answers (1)

MrfksIV

Reputation: 930

The following code uses streams to add a new column. The file I used was about 500 MB and the maximum RAM used was less than 50 MB. Note that instead of csv-parse I have imported csv, which is an 'umbrella' module that includes csv-parse, csv-generate, stream-transform and csv-stringify.

const fs = require('fs');
const csv = require('csv');
const path = require('path');
const EOL = require('os').EOL;

const FILE = path.join(__dirname, 'IN.csv');
const NEW_FILE = path.join(__dirname, 'OUT.csv');

const readStream = fs.createReadStream(FILE);
const writeStream = fs.createWriteStream(NEW_FILE);

// Parser stream: emits each CSV row as an array of column values.
const parse = csv.parse();

// Transform stream: appends a new column to every row and re-joins it as a CSV line.
const transform = csv.transform((row, cb) => {
    row.push('NEW_COL');
    const result = row.join(',') + EOL;
    cb(null, result);
});

readStream.pipe(parse).pipe(transform).pipe(writeStream);
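
One design note: joining with `row.join(',') + EOL` works for simple values, but it will not quote fields that themselves contain commas, quotes or newlines. The umbrella module also includes csv-stringify for that. A minimal sketch of the same pipeline using `csv.stringify()` (same hypothetical IN.csv/OUT.csv paths as above) could look like this:

const fs = require('fs');
const csv = require('csv');
const path = require('path');

const FILE = path.join(__dirname, 'IN.csv');        // input path, as in the example above
const NEW_FILE = path.join(__dirname, 'OUT.csv');   // output path, as in the example above

const parse = csv.parse();

// Synchronous handler form: return the modified row and let csv-stringify serialize it.
const transform = csv.transform((row) => {
    row.push('NEW_COL');
    return row;
});

// csv-stringify handles delimiters, quoting and line endings.
const stringify = csv.stringify();

fs.createReadStream(FILE)
    .pipe(parse)
    .pipe(transform)
    .pipe(stringify)
    .pipe(fs.createWriteStream(NEW_FILE));

Memory usage stays flat either way, since every stage is a stream and only a small buffer of rows is in memory at any time.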

Upvotes: 6
