Reputation: 181
I have a server that observes IoT data, and I need to store it in a database so that I can build metrics of some kind.
The data is received on a websocket connection; one piece of data might look something like this:
{
  id: 23,
  type: "light",
  status: "on",
  timestamp: 1607975079
}
There can be a large volume of events, and I suspect it's not a great idea to write each event to the database in real time, as that may cause bottlenecks.
Are there strategies for this type of scenario, where I could use an in-memory buffer of some kind and then save a chunk of data to the database every 30 minutes?
Is this a sensible approach, or are there other ways this should be handled?
Any suggestions would be appreciated.
Thanks
Upvotes: 1
Views: 2321
Reputation: 97
You can use Redis as a buffer and create a separate worker that saves the buffered data to your database in bulk.
The idea is shown in the Docker Voting App: https://github.com/dockersamples/example-voting-app
You can build all the components in Node.js; note that the language mix in the example above is only there to showcase Docker's capabilities.
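A minimal sketch of that producer/worker split, assuming an ioredis-style client whose lpush/rpop return promises. The queue key name "iot:events" and the saveBulk callback are placeholders for your own setup, not anything prescribed by the voting-app example:

```javascript
// Producer/worker split over a Redis list, sketched with an
// ioredis-style client (promise-returning lpush/rpop).
const QUEUE_KEY = 'iot:events'; // placeholder key name

// Producer side: called from the websocket handler for each message.
async function bufferEvent(redis, event) {
  await redis.lpush(QUEUE_KEY, JSON.stringify(event));
}

// Worker side: drain everything currently queued and hand it to the
// database in one bulk write. Run this on an interval or via cron.
async function drainOnce(redis, saveBulk) {
  const batch = [];
  let raw;
  // RPOP pairs with LPUSH to give FIFO ordering.
  while ((raw = await redis.rpop(QUEUE_KEY)) !== null) {
    batch.push(JSON.parse(raw));
  }
  if (batch.length > 0) await saveBulk(batch);
  return batch.length;
}
```

Because the queue lives in Redis rather than in the server process, buffered events survive a restart of the web server, and the worker can run as its own process, exactly as in the voting-app layout.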
Upvotes: 2
Reputation: 775
You can create a buffer and add events to it. When a certain number of events have accumulated, write the buffer to the database in one bulk insert.
Pseudocode:
let buffer = []; // `let`, not `const` — the buffer is reassigned after each flush

event.on('event', data => {
  buffer.push(data); // push the incoming payload, not the emitter
  if (buffer.length >= 10) {
    DataBase.bulkCreate(buffer); // one bulk insert instead of 10 single inserts
    buffer = [];
  }
});
Or you can use cron:
let buffer = [];

event.on('event', data => {
  buffer.push(data);
});

const job = () => {
  if (buffer.length === 0) return; // nothing to write this round
  DataBase.bulkCreate(buffer);
  buffer = [];
};

// node-cron: run the job every 30 minutes
cron.schedule('*/30 * * * *', job);
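The two variants can also be combined so the buffer flushes either when it fills up or when an interval elapses, whichever comes first — that bounds both memory use and data staleness. A self-contained sketch; the EventBuffer name, its default thresholds, and the saveBulk callback are all illustrative, not from the answer above:

```javascript
// Hybrid flush: write the buffer out when it reaches maxSize OR when
// flushMs elapses, whichever happens first. `saveBulk` stands in for
// your actual bulk-insert call (e.g. a Sequelize bulkCreate).
class EventBuffer {
  constructor(saveBulk, { maxSize = 100, flushMs = 30 * 60 * 1000 } = {}) {
    this.saveBulk = saveBulk;
    this.maxSize = maxSize;
    this.buffer = [];
    this.timer = setInterval(() => this.flush(), flushMs);
  }

  push(event) {
    this.buffer.push(event);
    if (this.buffer.length >= this.maxSize) this.flush();
  }

  flush() {
    if (this.buffer.length === 0) return;
    const batch = this.buffer;
    this.buffer = []; // swap first so new events keep accumulating during the write
    this.saveBulk(batch);
  }

  stop() {
    clearInterval(this.timer);
    this.flush(); // don't lose the tail on shutdown
  }
}
```

Note the stop() call on shutdown: with a pure in-process buffer, anything not yet flushed is lost if the process dies, which is the trade-off the Redis-based answer avoids.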
Upvotes: 1