Reputation: 1
I have a Spring Kafka @KafkaListener in a Spring Boot project:
@KafkaListener(topics = "topic-one", groupId = "groupone")
public void listen(CustomerDetails customerDetails) {
    if (customerDetails.getCertainDetails() != null && !customerDetails.getCertainDetails().isEmpty()) {
        dbInsert(customerDetails); // placeholder for the actual DB insert
    } else {
        log.info(customerDetails.toString());
    }
}
This listener will be receiving over a million messages a day. How do I ensure that I don't run into concurrency issues when many messages come in at once and are inserted into the DB? Or do I not need to worry about it? Is there a better solution than the approach above?
Upvotes: 0
Views: 578
Reputation: 191738
How do I ensure that I don't run into concurrency issues when many messages come in at once and are inserted into the DB
Unless your database client runs asynchronously, you are unlikely to run into that problem. The KafkaListener method is invoked synchronously on the consumer thread, and you can configure the max.poll.records setting in the Kafka consumer properties to control the backpressure; you would never be handed more records than that per poll.
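For example, with Spring Boot auto-configuration you could cap it in application.properties (a minimal sketch; the consumer default for max.poll.records is 500, and the value below is only illustrative):

# cap how many records a single poll() returns to the listener
spring.kafka.consumer.max-poll-records=100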
Are you currently seeing anything that indicates this is actually happening?
Is there a better solution
In general, yes. It requires you to stop managing Kafka consumers on your own and instead run a Kafka Connect sink connector for your respective database.
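For example, a JDBC sink could look roughly like this (a sketch only, assuming a JDBC-compatible database and Confluent's kafka-connect-jdbc connector; the connection details below are placeholders):

name=customer-details-jdbc-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=topic-one
connection.url=jdbc:postgresql://db-host:5432/customers
connection.user=<user>
connection.password=<password>
insert.mode=insert
auto.create=true

Connect then handles the consumer group, offsets, batching, and retries for you, so you do not have to write or scale the insert code yourself.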
Upvotes: 1