I'm building a NodeJS application. The application receives API calls and then, based on the message received, does something.
I need something that can handle, say, 1000 API calls within a few minutes without crashing my NodeJS process. It will have to queue the API calls and then execute what it has to afterwards. If the server crashes, the queue must pick up where it left off when it restarts.
I'm sorry, I'm new to programming and I don't know what other alternatives are available. I've read up about RabbitMQ, but my employer wants something that is free to implement and refuses to offer me any expertise on this.
Upvotes: 1
Views: 2912
Reputation: 480
Check out Bull. It's a Redis-based job queue that does pretty much what you've described: it lets you add a job to the queue and carry on with your control flow, and the queue then processes jobs one by one. Because jobs are stored in Redis, they survive a server restart. I've attached a short snippet below for some context.
const Bull = require('bull');

// This looks for a Redis server running at port 6379 on your local machine by default
const jobQueue = new Bull('my_job_queue');

// `app` here is assumed to be an Express application
app.post('/doSomething', (req, res) => {
  const { taskObj } = req.body;
  // This is non-blocking: the job is queued and the request returns immediately
  jobQueue.add(taskObj);
  // Then, just return
  res.status(200).json({
    "message": "We're on it!",
    "something else": "We'll get back to you soon :)"
  });
});

// This runs once per job. The callback must be `async` to use `await` inside it.
jobQueue.process(async (job) => {
  const doSomething = (jobData) => {
    return new Promise((resolve) => {
      console.log(JSON.stringify(jobData));
      resolve(true);
    });
  };

  const { data } = job;
  // Wait to finish doing something with the data, and then move on.
  // Returning (or resolving) marks the job as complete.
  await doSomething(data);
});
Upvotes: 3