Reputation: 713
I need to continuously update data on the client based on DB changes. I'm thinking about running a 5 second interval function on the server that repeatedly gathers all the DB information and uses Socket.IO to emit the data to the client.
Currently I'm doing this on the client itself without Socket.IO, by repeatedly making a REST call to the server, which then handles the data.
My question is: Are either of these methods efficient or inefficient and is there a better solution to solve what I'm trying to achieve?
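For reference, the interval approach I'm considering would look roughly like this (just a sketch; io is an already-created Socket.IO server and getAllData() stands in for whatever query gathers the DB information):

// Poll the database every 5 seconds and broadcast the result to all clients.
// getAllData() is a placeholder for the query that gathers the DB information.
setInterval(async () => {
  const data = await getAllData();
  io.emit("db_data", data); // every connected client receives the full snapshot
}, 5000);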
Upvotes: 3
Views: 2760
Reputation: 1421
Ryan, you can try using MongoDB's collection.watch(), which fires an event every time a change is made to a collection. You would need to set it up within the socket connection event for it to work, though. Something along these lines:
io.sockets.on('connection', function(socket) {
  // when the socket is connected, start listening to MongoDB
  const MongoClient = require("mongodb").MongoClient;

  MongoClient.connect("mongodb://192.168.1.201")
    .then(client => {
      console.log("Connected correctly to server");

      // specify db and collections
      const db = client.db("your_db");
      const collection = db.collection("your_collection");
      const changeStream = collection.watch();

      // start listening to changes
      changeStream.on("change", function(change) {
        console.log(change);
        // this is where you can fire the socket.emit('the_change', change)
        socket.emit('the_change', change);
      });
    })
    .catch(err => {
      console.error(err);
    });
});
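On the client side you would then listen for that event. A minimal sketch (the event name 'the_change' matches the emit suggested above, and it assumes the Socket.IO client script is already loaded in the page):

// Browser side: connect and react whenever the server forwards a change event.
const socket = io();
socket.on("the_change", function(change) {
  console.log("Document changed:", change);
  // update the UI here using the fields of the change event as needed
});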
Note that change streams only work when MongoDB is running as a replica set, so using this approach will require you to set one up. You can follow those instructions or use a Dockerised replica set such as this one.
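If you only need a local single-node replica set for development, one way (an illustrative sketch, not the only option) is to start mongod with a replica set name and then initiate it from the mongo shell:

// Run in the mongo shell after starting mongod with a replica set name, e.g.:
//   mongod --replSet rs0 --dbpath /data/db
// This initiates a single-node replica set suitable for local development.
rs.initiate({ _id: "rs0", members: [{ _id: 0, host: "localhost:27017" }] });
rs.status(); // should report one member in state PRIMARY once initiation completes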
Upvotes: 1
Reputation: 717
I'd need more details to be sure, but it doesn't sound like a good solution.
If the data you need doesn't actually change every few seconds, then every connection polling every 5 seconds is mostly wasted work.
In that case, it's better to trigger an event at the point where the data is changed, and push a message through the sockets that are active at that moment.
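Concretely, that could look something like the sketch below, where the route that performs the write also emits the new state (the route path, payload shape and saveItem() helper are just illustrative assumptions; app is an Express app and io a Socket.IO server):

// Whenever a write actually happens, push the new data to connected clients
// instead of having every client poll on a timer.
app.post("/items", async (req, res) => {
  const saved = await saveItem(req.body); // placeholder for your DB write
  io.emit("item_updated", saved);         // notify all active sockets once
  res.status(201).json(saved);
});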
Upvotes: 0