Reputation: 371
It looks to me like I have a concurrency problem in Node.js, even though I am aware that this is not supposed to be possible.
I am processing a file line by line and writing the result to another file, also line by line. In the output file I notice that lines are being overwritten: every now and then a line appears to be partially overwritten by another line. I read the data from a read stream. It looks roughly like this:
const fs = require('fs');

let iStream = fs.createReadStream(inputFile);
let oStream = fs.createWriteStream(outputFile);
let remaining = '';

iStream.on('data', (data) => {
    remaining += data;
    let line = remaining.split(/\r?\n/);
    let lines = line.length;
    if (lines > 0) {
        // keep the last (possibly incomplete) line for the next chunk
        remaining = line[lines - 1];
        line.length = lines - 1;
        line.forEach((curr) => {
            oStream.write(processLine(curr));
        });
    }
});
Is there any possibility of this scheme producing write failures, or do I have to look somewhere else?
Upvotes: 0
Views: 1103
Reputation: 707158
This looks like a flow-control problem: you are most likely overflowing the write stream's internal buffer, but not paying any attention to backpressure.
You can either pass a callback into .write() and only proceed with the next write once that callback is called, or you can watch the return value of .write(): when it returns false, you have to wait for the drain event on the stream before writing more.
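For example, something along these lines (an untested sketch; it assumes your processLine() is synchronous and returns the text to write, newline included):

const fs = require('fs');

let iStream = fs.createReadStream(inputFile, 'utf8');
let oStream = fs.createWriteStream(outputFile);
let remaining = '';

iStream.on('data', (chunk) => {
    remaining += chunk;
    let parts = remaining.split(/\r?\n/);
    remaining = parts.pop();           // keep the trailing partial line
    let ok = true;
    for (let curr of parts) {
        ok = oStream.write(processLine(curr));
    }
    if (!ok) {
        // the write buffer is full: stop reading until it drains
        iStream.pause();
        oStream.once('drain', () => iStream.resume());
    }
});

iStream.on('end', () => {
    if (remaining) oStream.write(processLine(remaining));
    oStream.end();
});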
Another approach would be to write a transform stream and then use .pipe() so the streaming infrastructure manages the flow control for you.
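A minimal sketch of that approach, again assuming your processLine() from the question:

const fs = require('fs');
const { Transform } = require('stream');

// splits incoming chunks into lines and transforms each complete line
const lineTransform = new Transform({
    transform(chunk, encoding, callback) {
        this.remaining = (this.remaining || '') + chunk.toString();
        let parts = this.remaining.split(/\r?\n/);
        this.remaining = parts.pop();  // keep the trailing partial line
        for (let line of parts) {
            this.push(processLine(line));
        }
        callback();
    },
    flush(callback) {
        // emit whatever is left once the input ends
        if (this.remaining) this.push(processLine(this.remaining));
        callback();
    }
});

fs.createReadStream(inputFile)
    .pipe(lineTransform)
    .pipe(fs.createWriteStream(outputFile));

Because .pipe() handles pause/resume and drain internally, you never have to track backpressure yourself.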
Upvotes: 1