Reputation: 381
I have a problem dealing with multiple streams. Consider this case: I have a data.csv file whose content is
id,val
1,100
2,75
I want to produce output like this: half should be 50% of val, and we need to append the same rows but multiplied by 10.
id,val,half
1,100,50
2,75,37.5
10,1000,500
20,750,375
I am able to read the CSV from the file, transform it, and process it, but I can only produce one stream. That is, since there are only 2 rows to begin with, my output also contains only 2 rows. I am able to produce a result like this
id,val,half
1,100,50
2,75,37.5
but I am unable to add the extra part, i.e. the whole content multiplied by 10.
One solution is very simple: read the whole file, hold it in a JavaScript array/object, process it, and write it back to disk. But that does not use pipes at all, and I want to write the whole thing in terms of pipes; a rough sketch of the in-memory approach I want to avoid is below.
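For reference, the non-pipe version would look roughly like this (just a sketch; file names are placeholders taken from the example above):

// Rough sketch of the in-memory approach (no pipes) that I want to avoid
var fs = require('fs');

var lines = fs.readFileSync('data.csv', 'utf8').trim().split('\n');
var out = ['id,val,half'];
lines.slice(1).forEach(function (line) {
    var parts = line.split(',');
    var id = Number(parts[0]);
    var val = Number(parts[1]);
    out.push(id + ',' + val + ',' + val / 2);                   // original row with half
    out.push(id * 10 + ',' + val * 10 + ',' + (val * 10) / 2);  // same row multiplied by 10
});
fs.writeFileSync('forecast.csv', out.join('\n') + '\n');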
I already have piped code along these lines.
// Initialize the file stream the output will be written to
const forecastcsv = fs.createWriteStream(forecastfile);
// Parse the source CSV, transform each record, stringify and write it out
fs.createReadStream(argv.file).pipe(parser).pipe(transformer).pipe(stringifier).pipe(forecastcsv);
I cannot work out how to create two streams (one computing the half, the other multiplying by 10) and then join them to get the doubled rows in the final CSV file.
Thanks
Upvotes: 0
Views: 640
Reputation: 381
Thanks @Chris Anderson-MSFT. I had a misunderstanding about the transform being 1:1; we can emit more data. So here is my transformer, which is working. There is no need for multiple streams.
var transformer = csv.transform(function (record, callback) {
    // First output record: the original row plus its half value
    record.half = record.val / 2;
    // Deep-copy the record and scale every field by 10 for the second output record
    var extra = JSON.parse(JSON.stringify(record));
    extra.id = extra.id * 10;
    extra.val = extra.val * 10;
    extra.half = extra.half * 10;
    // Emit both records for this single input row
    callback.call(this, null, record, extra);
});
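For completeness, wiring this transformer into the pipe chain from the question might look roughly like this (a sketch assuming the csv package's parse/stringify helpers; the option names and the forecast.csv file name are assumptions, not the exact original setup):

var fs = require('fs');
var csv = require('csv');

// Sketch only: parse rows into objects keyed by the header, run the transformer
// above, stringify with a header row, and write the result to disk.
var parser = csv.parse({ columns: true });
var stringifier = csv.stringify({ header: true });
var forecastcsv = fs.createWriteStream('forecast.csv');

fs.createReadStream('data.csv')
    .pipe(parser)
    .pipe(transformer)
    .pipe(stringifier)
    .pipe(forecastcsv);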
Upvotes: 1