user2457035

Reputation: 81

fs.createReadStream to read an object and then pipe to a writable file stream?

Currently I have a module pulling sql results like this:

[{ID: 'test', NAME: 'stack'},{ID: 'test2', NAME: 'stack'}]

I want to just literally have that written to a file so I can read it back as an object later, but I want to write it as a stream because some of the objects are really, really huge and keeping them in memory isn't working anymore.

I am using mssql https://www.npmjs.org/package/mssql

and I am stuck here:

    request.on('recordset', function(result) {
        console.log(result);
    });

How do I stream this out to a writable stream? I see options for object mode, but I can't seem to figure out how to set it:

    request.on('recordset', function(result) {
        var readable = fs.createReadStream(result),
            writable = fs.createWriteStream("loadedreports/bot"+x[6]);
        readable.pipe(writable);
    });

This just errors because createReadStream expects a file path, not an object...

Am I on the right track here, or do I need to do something else?

Upvotes: 0

Views: 2099

Answers (1)

David Losert

Reputation: 4802

You're almost on the right track: you just don't need a readable stream, since your data already arrives in chunks.

Then, you can just create the writable stream OUTSIDE of the actual 'recordset' event; otherwise you would create a new stream every time you get a new chunk (and that is not what you want).

Try it like this:

    var writable = fs.createWriteStream("loadedreports/bot" + x[6]);
    request.on('recordset', function(result) {
        // write streams only accept strings or Buffers, so serialize first
        writable.write(JSON.stringify(result));
    });

EDIT

If the recordset is already too big, use the row event instead:

    request.on('row', function(row) {
        // same as above: serialize each row before writing it out
        writable.write(JSON.stringify(row));
    });
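The row-by-row approach can be sketched end to end. streamRowsToFile is a hypothetical helper, and request.pause()/request.resume() plus the 'done' event assume a newer mssql version with request.stream = true enabled, so check your version's docs:

```javascript
// Hedged sketch: serialize each row as a line of JSON and respect
// backpressure by pausing the query while the file stream's internal
// buffer is full, resuming once it drains.
function streamRowsToFile(request, writable) {
  request.on('row', function (row) {
    var ok = writable.write(JSON.stringify(row) + '\n');
    if (!ok && request.pause) {
      request.pause();
      writable.once('drain', function () { request.resume(); });
    }
  });
  request.on('done', function () {
    writable.end(); // close the file once the query finishes
  });
}

// Usage (illustrative):
// request.stream = true;
// streamRowsToFile(request, fs.createWriteStream('loadedreports/bot' + x[6]));
```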

Upvotes: 1
