Pawel

Reputation: 494

Spring Boot Kafka - Kafka metrics not available in /actuator/prometheus

I would like to monitor Kafka metrics, but unfortunately nothing related to Kafka is exposed under the /actuator/prometheus endpoint. Is something missing from my setup?

Application dependencies: Kotlin 1.4.31, Spring Boot 2.3.9, Spring Kafka 2.6.7, Reactor Kafka 1.2.5, Kafka Clients 2.5.1

Application config:

    management:
      server:
        port: 8081
      endpoints:
        web:
          exposure:
            include: health,info,metrics,prometheus
    
    spring:
      jmx:
        enabled: true
      kafka:
        bootstrap-servers: ...
        consumer:
          group-id: my-service
          key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
        listener:
          ack-mode: manual
        ssl:
          key-store-location: ...
          key-store-password: ...
        security:
          protocol: SSL

My receivers look like:

    @Bean
    fun someEventReceiver(): SomeEventReceiver =
        KafkaReceiver.create(
            ReceiverOptions.create<String, SomeEvent>(kafkaProperties.buildConsumerProperties())
                .withValueDeserializer(SomeEventDeserializer())
                .subscription(listOf(serviceProperties.kafka.topics.someevent))
        )

And the listener:

    @EventListener(ApplicationStartedEvent::class)
    fun onSomeEvent() {
        someEventReceiver
            .receive()
            .groupBy { it.receiverOffset().topicPartition() }
            .publishOn(Schedulers.boundedElastic())
            .flatMap { someEvent ->
                someEvent
                    .publishOn(Schedulers.boundedElastic())
                    .delayUntil(::handleEvent)
                    .doOnNext { it.receiverOffset().acknowledge() }
                    .retryWhen(Retry.backoff(10, Duration.ofMillis(100)))
            }
            .retryWhen(Retry.indefinitely())
            .subscribe()
    }

Upvotes: 4

Views: 5021

Answers (3)

Christos

Reputation: 934

reactor-kafka is now integrated with Micrometer (since v1.3.17).

Registering a MicrometerConsumerListener with a KafkaReceiver should be as simple as:

    MeterRegistry registry = new SimpleMeterRegistry();
    MicrometerConsumerListener consumerListener = new MicrometerConsumerListener(registry);

    Map<String, Object> consumerProperties = getConsumerProperties();
    // ReceiverOptions is immutable, so keep the instance returned by consumerListener()
    ReceiverOptions<String, String> receiverOptions = ReceiverOptions.<String, String>create(consumerProperties)
            .consumerListener(consumerListener);

    KafkaReceiver<String, String> receiver = KafkaReceiver.create(receiverOptions);

Similarly, for KafkaSender:

    MicrometerProducerListener producerListener = new MicrometerProducerListener(registry);

    Map<String, Object> producerProperties = getProducerProperties();
    // SenderOptions is immutable as well, so chain producerListener() the same way
    SenderOptions<String, String> producerOptions = SenderOptions.<String, String>create(producerProperties)
            .producerListener(producerListener);

    KafkaSender<String, String> kafkaSender = KafkaSender.create(producerOptions);
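
In a Spring Boot application like the one in the question, the auto-configured MeterRegistry can be injected instead of creating a SimpleMeterRegistry, so that the meters end up on /actuator/prometheus. A rough Kotlin sketch along the lines of the question's bean (it assumes an upgrade to reactor-kafka 1.3.17+, that meterRegistry is injected like kafkaProperties, and that MicrometerConsumerListener here is reactor-kafka's listener from the snippet above):

    @Bean
    fun someEventReceiver(): SomeEventReceiver =
        KafkaReceiver.create(
            ReceiverOptions.create<String, SomeEvent>(kafkaProperties.buildConsumerProperties())
                .withValueDeserializer(SomeEventDeserializer())
                .subscription(listOf(serviceProperties.kafka.topics.someevent))
                // requires reactor-kafka >= 1.3.17; meterRegistry is the auto-configured bean
                .consumerListener(MicrometerConsumerListener(meterRegistry))
        )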

Upvotes: 4

Pawel

Reputation: 494

Following on from what @gary-russell suggested (thanks again for the help!), I took a slightly different approach to building the receivers, to reduce the amount of code, since my project has a lot of consumers.

    class KafkaReceiverWithMetrics<K, V>(
        private val receiver: KafkaReceiver<K, V>,
        private val consumerId: String,
        private val metricsListener: MicrometerConsumerListener<K, V>,
    ) : KafkaReceiver<K, V> by receiver {

        override fun receive(): Flux<ReceiverRecord<K, V>> =
            receiver.receive()
                .doOnSubscribe {
                    receiver
                        .doOnConsumer { consumer -> metricsListener.consumerAdded(consumerId, consumer) }
                        .subscribe()
                }
    }

And then I just need a single bean per receiver:

    @Bean
    fun someEventReceiver(): SomeEventReceiver =
        KafkaReceiverWithMetrics(
            KafkaReceiver.create(
                ReceiverOptions.create<String, SomeEvent>(kafkaProperties.buildConsumerProperties())
                    .withValueDeserializer(SomeEventDeserializer())
                    .subscription(listOf(topics.someEvent))
            ),
            topics.someEvent,
            MicrometerConsumerListener(meterRegistry)
        )
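
If the wrapping itself starts to repeat across many beans, it could be pulled into a small helper; just a sketch with made-up names (MeteredReceiverFactory and its create function), assuming the same kafkaProperties and meterRegistry beans as above:

    class MeteredReceiverFactory(
        private val kafkaProperties: KafkaProperties,
        private val meterRegistry: MeterRegistry,
    ) {
        fun <V> create(topic: String, valueDeserializer: Deserializer<V>): KafkaReceiver<String, V> =
            KafkaReceiverWithMetrics(
                KafkaReceiver.create(
                    ReceiverOptions.create<String, V>(kafkaProperties.buildConsumerProperties())
                        .withValueDeserializer(valueDeserializer)
                        .subscription(listOf(topic))
                ),
                topic,                                    // consumer id, ends up as the spring_id tag
                MicrometerConsumerListener(meterRegistry) // spring-kafka listener, as above
            )
    }

    @Bean
    fun someEventReceiver(factory: MeteredReceiverFactory): SomeEventReceiver =
        factory.create(topics.someEvent, SomeEventDeserializer())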

Upvotes: 1

Gary Russell

Reputation: 174484

Unlike spring-kafka, reactor-kafka doesn't currently have any Micrometer integration.

If you also have spring-kafka on the class path, you can leverage its MicrometerConsumerListener to bind a KafkaClientMetrics to the meter registry (or you can do the binding yourself).
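
For the "do the binding yourself" route, Micrometer's own KafkaClientMetrics binder can be attached to the underlying consumer via doOnConsumer. A rough sketch in the question's Kotlin style (the someEventReceiver and meterRegistry references are assumptions, and it should run after the receive() pipeline has been subscribed, as in the Java example below):

    someEventReceiver
        .doOnConsumer { consumer ->
            // Micrometer's KafkaClientMetrics binder registers the kafka.consumer.* meters
            KafkaClientMetrics(consumer).bindTo(meterRegistry)
        }
        .subscribe()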

Here is an example using the Spring listener:

    @SpringBootApplication
    public class So66706766Application {

        public static void main(String[] args) {
            SpringApplication.run(So66706766Application.class, args);
        }

        @Bean
        ApplicationRunner runner(MicrometerConsumerListener<String, String> consumerListener) {
            return args -> {
                ReceiverOptions<String, String> ro = ReceiverOptions.<String, String>create(
                            Map.of(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092",
                                    ConsumerConfig.GROUP_ID_CONFIG, "so66706766"))
                        .withKeyDeserializer(new StringDeserializer())
                        .withValueDeserializer(new StringDeserializer())
                        .subscription(Collections.singletonList("so66706766"));
                KafkaReceiver<String, String> receiver = KafkaReceiver.create(ro);
                receiver.receive()
                        .doOnNext(rec -> {
                            System.out.println(rec.value());
                            rec.receiverOffset().acknowledge();
                        })
                        .subscribe();
                receiver.doOnConsumer(consumer -> {
                    consumerListener.consumerAdded("myConsumer", consumer);
                    return Mono.empty();
                }).subscribe();
            };
        }

        @Bean
        MicrometerConsumerListener<String, String> consumerListener(MeterRegistry registry) {
            return new MicrometerConsumerListener<>(registry);
        }

        @Bean
        NewTopic topic() {
            return TopicBuilder.name("so66706766").partitions(1).replicas(1).build();
        }

    }

and the Prometheus endpoint then includes, for example:

    # HELP kafka_consumer_successful_authentication_total The total number of connections with successful authentication
    # TYPE kafka_consumer_successful_authentication_total counter
    kafka_consumer_successful_authentication_total{client_id="consumer-so66706766-1",kafka_version="2.6.0",spring_id="myConsumer",} 0.0
    # HELP jvm_gc_live_data_size_bytes Size of long-lived heap memory pool after reclamation
    # TYPE jvm_gc_live_data_size_bytes gauge
    jvm_gc_live_data_size_bytes 0.0
    # HELP kafka_consumer_connection_creation_rate The number of new connections established per second
    # TYPE kafka_consumer_connection_creation_rate gauge
    kafka_consumer_connection_creation_rate{client_id="consumer-so66706766-1",kafka_version="2.6.0",spring_id="myConsumer",} 0.07456936193482637
    ...

I added an issue: https://github.com/reactor/reactor-kafka/issues/206

Upvotes: 4
