Reputation: 722
I'm using the node package 'csv', and I've run into an issue in the most obvious use case for the package and am not sure how to proceed. Package repo: https://github.com/wdavidw/node-csv
I need to: 1) read in a CSV file and perform a single operation for each line, and 2) after the entire CSV file has been read, perform an action and write the result to a new CSV file.
I'm stuck on part one. Here is what I have after piecing together a bunch of (seemingly inconsistent) examples:
var fs = require('fs');
var csv = require('csv');
var transform = require('stream-transform');

var outputMap = {};
var baseStream = fs.createReadStream(__dirname + '/locationTaxonomy.csv');

baseStream
  .pipe(csv.parse())
  .pipe(csv.transform(function (record) {
    outputMap[record[2]] = record;
    return record;
  }));
The preceding only gets through the first ~16 lines of the CSV file, and then halts. If I pipe baseStream directly to process.stdout, the file is read in its entirety. Any ideas on how to accomplish this seemingly trivial task?
Upvotes: 3
Views: 2395
Reputation: 722
The return statement in the transform handler caused the stream to halt: returning a value makes csv.transform push output records, and since nothing downstream consumed them, the readable buffer filled to its high-water mark (16 objects by default, which matches the ~16 lines that got through) and backpressure paused the whole pipeline. Removing the return allowed the complete CSV file to be read:
var fs = require('fs');
var csv = require('csv');

var outputMap = {};
var baseStream = fs.createReadStream(__dirname + '/locationTaxonomy.csv');

baseStream
  .pipe(csv.parse())
  .pipe(csv.transform(function (record) {
    outputMap[record[2]] = record;
    // return record; // removed: pushing records with no consumer stalls the stream
  }));
Upvotes: 4