jordan.baucke

Reputation: 4328

Mongoose QueryStream new results

I am trying to set up MongooseJS to push out the whole collection (or just the newest item) when new documents are inserted into the collection by another application.

I assumed QueryStream was the way to go.

However, when I start my simple application, it reads out the collection once and then the stream closes.

When I insert a new document, nothing happens (presumably because the connection is no longer open and watching for new results...?)

var Orders = db.model('orders', OrderSchema);

var stream = Orders.find().stream();

stream.on('data', function(doc){
    console.log('New item!');
    console.log(doc);
}).on('error', function (error){
    console.log(error);
}).on('close', function () {
    console.log('closed');
});

This immediately prints all the items currently in the orders collection, and then prints "closed". Shouldn't the "Stream" remain open and print new data when the collection changes?

What am I not understanding about the MongooseJS QueryStream?

P.S. My goal is to eventually emit an updated collection via socket.io, as demonstrated here: Mongoose stream return few results first time

Upvotes: 2

Views: 2721

Answers (2)

durum

Reputation: 3404

You need some marker (a timestamp or just a plain number) so you don't fetch the whole collection every time you start streaming. For example, if you store a timestamp in each collection entry, you could use:

    // Stream only documents inserted from now on
    var filter = { "timestamp": { "$gte": Date.now() } };
    var stream = Orders.find(filter).tailable().stream();

Think of MongoDB tailable streaming as the tail -f command in bash.
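For the filter above to match anything, each inserted document needs a timestamp field. A minimal sketch of such a schema (the field name timestamp and the capped sizes are assumptions; tailable cursors also require a capped collection, as the other answer notes):

    // Sketch only: stores the insert time as a Number so it compares
    // cleanly against Date.now() in the $gte filter above.
    var mongoose = require('mongoose');

    var OrderSchema = new mongoose.Schema({
        item: String,
        timestamp: { type: Number, default: Date.now } // milliseconds since epoch
    }, { capped: { size: 4096, max: 1000 } }); // capped collection required for tailable()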

Upvotes: 4

jordan.baucke

Reputation: 4328

I discovered that in order for this method to work I needed to change my collection to a capped collection:

// Tailable cursors only work on capped collections; size is in bytes
// (MongoDB rounds very small sizes up) and max limits the document count.
var OrderSchema = new Mongoose.Schema({...
}, { capped: { size: 10, max: 10, autoIndexId: true }});

var Orders = db.model('orders', OrderSchema);

// tailable() keeps the cursor open, like tail -f, so new inserts
// into the capped collection arrive as 'data' events.
var stream = Orders.find().tailable().stream();

stream.on('data', function(doc){
    console.log('New item!');
    console.log(doc);
}).on('error', function (error){
    console.log(error);
}).on('close', function () {
    console.log('closed');
});

This works because I can now treat the MongoDB collection as something like a message queue that is continuously updated.

Strangely enough, when I wrap this inside a SocketIO event I get multiples of the same documents, which makes me think there is still something I'm not doing exactly right...
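A likely explanation for the duplicates: if the tailable stream is opened inside every Socket.IO connection handler, each connected socket gets its own cursor replaying the same documents. A minimal sketch of one way around this, opening the stream once and broadcasting (the server variable and the 'order' event name are assumptions):

var io = require('socket.io').listen(server); // server: your existing HTTP server

// Open the tailable stream once, outside any connection handler.
var stream = Orders.find().tailable().stream();

stream.on('data', function (doc) {
    // Broadcast each new document to every connected client.
    io.sockets.emit('order', doc); // 'order' is a hypothetical event name
});

io.sockets.on('connection', function (socket) {
    // No per-connection stream here; opening a new tailable cursor
    // per connection is what produces duplicate documents.
});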

Upvotes: 5
