ffflabs

Reputation: 17481

Node Streams, wrap array as object

I have a metadata object in the form

{ 
    filename: "hugearray.json",
    author: "amenadiel",
    date: "2014-07-11",
    introduction: "A huge ass array I want to send to the browser"
}

That hugearray.json is a text file in my folder which, as its name implies, contains an array of arbitrarily many elements.

[
    [14, 17, 25, 38, 49],
    [14, 41, 54, 57, 58],
    [29, 33, 39, 53, 59],
    ...
    [3, 14, 18, 34, 37],
    [3, 7, 14, 29, 33],
    [5, 16, 19, 30, 49]
]

What I want to achieve is to output to the browser the original object with one extra key, 'content', whose value is the huge array:

{ 
    filename: "hugearray.json",
    author: "amenadiel",
    date: "2014-07-11",
    introduction: "A huge ass array I want to send to the browser",
    content: [
                [14, 17, 25, 38, 49],
                ...
                [5, 16, 19, 30, 49]
             ]
}

But since I don't know the array size, I don't want to store the whole thing in memory before outputting, so I thought of using streams. I can stream the array fine with

var readStream = fs.createReadStream("hugearray.json");

readStream.on('open', function () {
    readStream.pipe(res);
});

And of course I can send the metadata object to the res with

res.json(metadata);

I've also tried deconstructing the metadata myself: writing each key: value pair, leaving a content key open, piping in the file's contents, and then closing the curly brace. It doesn't work; the array ends up outside the object:

{ 
    filename: "hugearray.json",
    author: "amenadiel",
    date: "2014-07-11",
    introduction: "A huge ass array I want to send to the browser",
    content:
}[
    [14, 17, 25, 38, 49],
    [14, 41, 54, 57, 58],
    [29, 33, 39, 53, 59],
    ...
    [3, 14, 18, 34, 37],
    [3, 7, 14, 29, 33],
    [5, 16, 19, 30, 49]
]

I guess I need to wrap the stream in my metadata's content key instead of trying to output JSON and a stream into the same result. Any ideas?

Upvotes: 1

Views: 975

Answers (1)

ffflabs

Reputation: 17481

Well, my question went unnoticed, but it earned me the Tumbleweed badge. It's something.

I kept investigating and came up with a solution. I was hoping for a one-liner, but this works too, and so far I've been able to output several MB to the browser without a noticeable performance hit in my node process.

This is the method I used:

app.get('/node/arraystream', function (req, res) {
    var readStream = fs.createReadStream("../../temp/bigarray.json");
    var myObject = {
        filename: "hugearray.json",
        author: "amenadiel",
        date: "2014-07-11",
        introduction: "A huge ass array I want to send to the browser"
    };

    readStream.on('open', function () {
        console.log('readStream open');
        // Send the stringified object minus its closing brace,
        // then open the "content" key the file stream will fill.
        var myObjectstr = JSON.stringify(myObject);
        res.write(myObjectstr.substring(0, myObjectstr.length - 1) + ',"content":');
    });

    readStream.on('error', function (err) {
        console.log('readStream error', err);
        throw err;
    });

    readStream.on('data', function (data) {
        // data is already a Buffer, so it can be written through as-is
        // (the old `new Buffer(data, 'ascii')` copy is deprecated and unnecessary)
        console.log('readStream received data', data.length);
        res.write(data);
    });

    readStream.on('close', function () {
        console.log('readStream closed');
        // The whole array has been flushed; close the wrapping object.
        res.write('}');
        res.end();
    });

});

Basically, instead of turning my object into a stream, I write the stringified object in two halves around the streamed array chunks.
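For what it's worth, the framing can be checked in isolation. Here is a minimal sketch that simulates the file chunks with an array of strings (the chunk values are made up for illustration; any split of the raw array text behaves the same way):

```javascript
// Stand-ins for whatever fs.createReadStream would emit for the file
const arrayChunks = [
    '[[14,17,25,38,49],',
    '[14,41,54,57,58],',
    '[5,16,19,30,49]]'
];

const metadata = {
    filename: 'hugearray.json',
    author: 'amenadiel',
    date: '2014-07-11'
};

// Drop the object's closing brace, open a "content" key,
// append the chunks verbatim, then close the brace again.
const head = JSON.stringify(metadata);
let body = head.slice(0, -1) + ',"content":';
for (const chunk of arrayChunks) {
    body += chunk; // in the real route this is res.write(chunk)
}
body += '}';

const parsed = JSON.parse(body); // the frame yields valid JSON end to end
console.log(parsed.content.length); // → 3
```

The same frame also works with `readStream.pipe(res, { end: false })`, which moves the chunks for you while leaving the response open, so you can still write the closing brace on the stream's `end` event.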

Upvotes: 2
