Rahul

Reputation: 687

How to implement an event-based consumer that can process at most n messages at any given time?

We have a scenario in the application where a user uploads a file from the UI. The file, along with some metadata, is saved to the DB. Currently, we run a scheduler that polls the DB every 15 seconds to see if any files need to be processed. It can process up to 10 files at a time.

Currently, the APIs and the scheduler are both in the same service, deployed on GCP. We are using Spring WebFlux. The application runs on 2 pods, and we ensure the scheduler runs on only 1 pod at a time by using ShedLock.
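For context, the current polling job looks roughly like this (the repository and processor types are simplified placeholders for our real classes):

```java
import java.util.List;

import net.javacrumbs.shedlock.spring.annotation.SchedulerLock;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class FilePollingScheduler {

    // Illustrative placeholders for our real repository and processor.
    interface FileRepository { List<String> findPendingFileIds(int limit); }
    interface FileProcessor { void process(String fileId); }

    private final FileRepository fileRepository;
    private final FileProcessor fileProcessor;

    public FilePollingScheduler(FileRepository fileRepository, FileProcessor fileProcessor) {
        this.fileRepository = fileRepository;
        this.fileProcessor = fileProcessor;
    }

    // Runs every 15 seconds; ShedLock ensures only one pod holds the lock.
    @Scheduled(fixedDelay = 15_000)
    @SchedulerLock(name = "fileProcessingJob", lockAtMostFor = "PT14S")
    public void pollAndProcess() {
        // Pick up at most 10 pending files per run.
        fileRepository.findPendingFileIds(10).forEach(fileProcessor::process);
    }
}
```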

We want to move the file-processing scheduler into a separate module/service/component and change it to an event-based mechanism. The producer (on the existing service) would send a fileId to Kafka or GCP Pub/Sub after saving the file details in the DB. The consumer (on the new service) should read the message and process it only if the number of files under processing is less than 10. Once the number of files under processing drops below 10, the next fileId should be picked up (preferably from memory rather than by going back to the DB) and processed.
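If we go with Kafka, something like this Reactor Kafka sketch is what we are imagining (the topic name, group id, and processing step are placeholders; `flatMap` with a concurrency of 10 is the part that matters):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import reactor.kafka.receiver.KafkaReceiver;
import reactor.kafka.receiver.ReceiverOptions;
import reactor.kafka.receiver.ReceiverRecord;

public class FileEventConsumer {

    public Flux<Void> consume() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "file-processor");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);

        ReceiverOptions<String, String> options = ReceiverOptions
                .<String, String>create(props)
                .subscription(List.of("file-events"));

        return KafkaReceiver.create(options)
                .receive()
                // flatMap with concurrency = 10 keeps at most 10 files in flight.
                // When one finishes, the next record already buffered in memory
                // is picked up -- no DB poll and no external cache needed.
                .flatMap(this::processFile, 10);
    }

    private Mono<Void> processFile(ReceiverRecord<String, String> record) {
        String fileId = record.value();
        return doProcess(fileId)
                .doOnSuccess(v -> record.receiverOffset().acknowledge());
    }

    // Placeholder for the actual file-processing logic.
    private Mono<Void> doProcess(String fileId) {
        return Mono.empty();
    }
}
```

Our understanding is that backpressure would make the receiver buffer records in memory and hand over the next one as soon as a slot frees up, which matches the "pick from memory" requirement. Is this a reasonable approach?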

We could use another scheduler to handle this, but we are wondering if there is a better way to do it. We would also like to avoid introducing a new component such as a cache, to avoid additional expense.
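For comparison, if we went with GCP Pub/Sub instead, the client library's built-in flow control seems to achieve the same limit without any extra component (the project and subscription names below are made up):

```java
import com.google.api.gax.batching.FlowControlSettings;
import com.google.cloud.pubsub.v1.AckReplyConsumer;
import com.google.cloud.pubsub.v1.MessageReceiver;
import com.google.cloud.pubsub.v1.Subscriber;
import com.google.pubsub.v1.ProjectSubscriptionName;
import com.google.pubsub.v1.PubsubMessage;

public class PubSubFileConsumer {

    public static Subscriber start() {
        // Project and subscription names are made up.
        ProjectSubscriptionName subscription =
                ProjectSubscriptionName.of("my-project", "file-events-sub");

        MessageReceiver receiver = (PubsubMessage message, AckReplyConsumer consumer) -> {
            String fileId = message.getData().toStringUtf8();
            try {
                processFile(fileId); // placeholder for the real processing
                consumer.ack();
            } catch (Exception e) {
                consumer.nack();     // message will be redelivered
            }
        };

        Subscriber subscriber = Subscriber.newBuilder(subscription, receiver)
                // Client-side flow control: hand at most 10 undelivered
                // messages to the receiver at any given time.
                .setFlowControlSettings(FlowControlSettings.newBuilder()
                        .setMaxOutstandingElementCount(10L)
                        .build())
                .build();

        subscriber.startAsync().awaitRunning();
        return subscriber;
    }

    private static void processFile(String fileId) {
        // real work goes here
    }
}
```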

What is the best way to implement it?

Thanks in advance!

Upvotes: 0

Views: 36

Answers (0)
