Susitha Ravinda Senarath

Reputation: 1678

Spring kafka unit test listener not subscribing to topic

I have a sample project to explore Spring with Kafka (find here). I have a listener subscribing to the topic my-test-topic-upstream which will just log the message and key and publish the same to another topic, my-test-topic-downstream. I tried this with local Kafka (the docker-compose file is there) and it works.
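
For context, the listener in the project is roughly along these lines (a minimal sketch only; the class name TickListener and the use of KafkaTemplate are assumptions, not the actual project code):

@Component
public class TickListener {

    private static final Logger log = LoggerFactory.getLogger(TickListener.class);

    private final KafkaTemplate<String, String> kafkaTemplate;

    public TickListener(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @KafkaListener(topics = "my-test-topic-upstream", groupId = "my-test-group")
    public void listen(ConsumerRecord<String, String> record) {
        // log the key and value, then forward the same pair downstream
        log.info("key={} value={}", record.key(), record.value());
        kafkaTemplate.send("my-test-topic-downstream", record.key(), record.value());
    }
}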

Now I'm trying to write a test for this using an embedded Kafka server. Under test I have an embedded server starting up (TestContext.java), which should start before the test (overridden JUnit beforeAll).

private static EmbeddedKafkaBroker kafka() {
    EmbeddedKafkaBroker kafkaEmbedded =
        new EmbeddedKafkaBroker(
            3,
            false,
            1,
            "my-test-topic-upstream", "my-test-topic-downstream");
    Map<String, String> brokerProperties = new HashMap<>();
    brokerProperties.put("default.replication.factor", "1");
    brokerProperties.put("offsets.topic.replication.factor", "1");
    brokerProperties.put("group.initial.rebalance.delay.ms", "3000");
    kafkaEmbedded.brokerProperties(brokerProperties);
    try {
        kafkaEmbedded.afterPropertiesSet();
    } catch (Exception e) {
        throw new RuntimeException(e);
    }

    return kafkaEmbedded;
}

Then I create a producer (TickProducer) and publish a message to the topic, which I expect my listener will be able to consume.

public TickProducer(String brokers) {
    Properties props = new Properties();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, brokers);
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

    producer = new KafkaProducer<>(props);
}

public RecordMetadata publishTick(String brand)
        throws ExecutionException, InterruptedException {
    return publish(TOPIC, brand, Instant.now().toString());
}

private RecordMetadata publish(String topic, String key, String value)
        throws ExecutionException, InterruptedException {
    final RecordMetadata recordMetadata = producer.send(new ProducerRecord<>(topic, key, value)).get();
    producer.flush();
    return recordMetadata;
}

I see the following log message being logged repeatedly.

11:32:35.745 [main] WARN  o.apache.kafka.clients.NetworkClient - [Consumer clientId=consumer-1, groupId=my-test-group] Connection to node -1 could not be established. Broker may not be available.

Finally it fails with:

11:36:52.774 [main] ERROR o.s.boot.SpringApplication - Application run failed
org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'; nested exception is org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata

Any tips here?

Upvotes: 1

Views: 1343

Answers (1)

Gary Russell

Reputation: 174689

Look at the INFO ConsumerConfig log to see where it is trying to connect (compare it to the ProducerConfig log). I suspect you haven't updated the Spring Boot bootstrap-servers property to point to the embedded broker.

See

/**
 * Set the system property with this name to the list of broker addresses.
 * @param brokerListProperty the brokerListProperty to set
 * @return this broker.
 * @since 2.3
 */
public EmbeddedKafkaBroker brokerListProperty(String brokerListProperty) {
    this.brokerListProperty = brokerListProperty;
    return this;
}

Set it to spring.kafka.bootstrap-servers which will then be used instead of SPRING_EMBEDDED_KAFKA_BROKERS.
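
Applied to the kafka() factory in the question, that would look something like this (a sketch; the rest of the setup stays the same):

EmbeddedKafkaBroker kafkaEmbedded =
    new EmbeddedKafkaBroker(
        3,
        false,
        1,
        "my-test-topic-upstream", "my-test-topic-downstream")
        // expose the broker addresses under the Boot property so the
        // auto-configured consumer/producer factories connect to the
        // embedded broker instead of the default localhost:9092
        .brokerListProperty("spring.kafka.bootstrap-servers");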

BTW, it's generally easier to use the @EmbeddedKafka annotation instead of instantiating the server yourself.
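
For example (a sketch of the annotation-driven setup; the test class name is made up, and bootstrapServersProperty again points Boot at the embedded broker):

@SpringBootTest
@EmbeddedKafka(
        partitions = 1,
        topics = { "my-test-topic-upstream", "my-test-topic-downstream" },
        bootstrapServersProperty = "spring.kafka.bootstrap-servers")
class TickStreamTest {

    // the embedded broker is started before the application context;
    // its addresses are published under spring.kafka.bootstrap-servers,
    // so the @KafkaListener container and the producer both connect to it
}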

Upvotes: 1
