Reputation: 4306
I'm generating a CSV file that I'd like to save. It's a bit large, but the code is very simple. I use streams to prevent out-of-memory errors, but I'm running out of memory regardless. Any tips?
const fs = require('fs');

var noOfRows = 2000000000;
var stream = fs.createWriteStream('myFile.csv', {flags: 'a'});

for (var i = 0; i <= noOfRows; i++) {
  var col = '';
  col += i;
  stream.write(col);
}
Upvotes: 1
Views: 776
Reputation: 1200
Add a 'drain' event listener and only keep writing while write() returns true:
const fs = require("fs");

var noOfRows = 2000000000;
var stream = fs.createWriteStream("myFile.csv", { flags: "a" });
var i = 0;

function write() {
  var ok = true;
  do {
    var data = i + "";
    if (i === noOfRows) {
      // last time!
      stream.write(data);
    } else {
      // see if we should continue, or wait
      // don't pass the callback, because we're not done yet.
      ok = stream.write(data);
    }
    i++;
  } while (i <= noOfRows && ok);
  if (i <= noOfRows) {
    // had to stop early!
    // write some more once it drains
    stream.once("drain", write);
  }
}

write();
Also, noOfRows is so big that the resulting .csv file may end up larger than your available disk space.
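As an alternative to the callback-based loop above, here is a minimal sketch of the same backpressure handling using events.once to await 'drain'; it assumes Node 11.13+ and adds a newline after each row, which the original snippet does not do:

const fs = require("fs");
const { once } = require("events");

const noOfRows = 2000000000;

async function writeRows() {
  const stream = fs.createWriteStream("myFile.csv", { flags: "a" });
  for (let i = 0; i <= noOfRows; i++) {
    // write() returns false when the internal buffer is full;
    // pause until 'drain' fires so memory usage stays bounded
    if (!stream.write(i + "\n")) {
      await once(stream, "drain");
    }
  }
  stream.end();
}

writeRows();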
Upvotes: 2
Reputation: 178
There is no hard size limit on .csv files. The limit in any scenario would be the filesystem / HDD size.
The maximum size of any file is determined by the filesystem itself, not by the file type or filename suffix.
To prevent such errors, check the file size limit and free space of your filesystem partition.
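If you want to check the free space from Node itself, here is a minimal sketch; it assumes a recent Node version (18.15+ / 19.6+) where fs.promises.statfs is available:

const fs = require("fs");

async function freeBytes(path) {
  const stats = await fs.promises.statfs(path);
  // blocks available to unprivileged users * block size
  return stats.bavail * stats.bsize;
}

freeBytes(".").then((bytes) => {
  console.log(`Free space: ${(bytes / 1024 ** 3).toFixed(2)} GiB`);
});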
Upvotes: 0
Reputation: 499
Your .csv file has too much data to be kept in the stream. Streams basically use your computer's physical memory, so they can store only up to the amount of free physical memory. E.g. if your computer has 8 GB of RAM, of which let's say 6 GB is free, then the stream can't store more than 6 GB. You can break it up into chunks and then merge them back at the destination later, as sketched below.
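A sketch of that chunking idea follows; the chunk size and part-file names are made up for illustration, and each chunk still has to respect backpressure while it is written:

const fs = require("fs");
const { once } = require("events");

const noOfRows = 2000000000;
const rowsPerChunk = 100000000; // illustrative chunk size

async function writeChunks() {
  for (let start = 0; start <= noOfRows; start += rowsPerChunk) {
    const end = Math.min(start + rowsPerChunk - 1, noOfRows);
    const part = start / rowsPerChunk;
    const stream = fs.createWriteStream(`myFile.part${part}.csv`);
    for (let i = start; i <= end; i++) {
      // honour backpressure inside each chunk as well
      if (!stream.write(i + "\n")) await once(stream, "drain");
    }
    stream.end();
    // wait until this chunk is fully flushed before starting the next one
    await once(stream, "finish");
  }
}

writeChunks();

The part files can then be concatenated at the destination, e.g. with cat myFile.part*.csv > myFile.csv on Unix-like systems.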
Upvotes: 0