Thomas

Reputation: 1219

Test kafka and flink integration flow

I would like to test Kafka / Flink integration using, for example, FlinkKafkaConsumer011 and FlinkKafkaProducer011.

The process will be:

  1. read from a Kafka topic with Flink
  2. apply some transformation with Flink
  3. write to another Kafka topic with Flink

As a string example: read a string from the input topic, convert it to uppercase, and write it to a new topic.
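A minimal sketch of that manipulation step, with the uppercase logic factored into a pure function (the class and method names here are made up) so it can be unit-tested without Kafka or a running Flink cluster:

```java
// Sketch of the manipulation step from the pipeline above, kept as a pure
// function. In the real job this would be the body of a Flink MapFunction
// sitting between FlinkKafkaConsumer011 (source) and FlinkKafkaProducer011 (sink).
import java.util.Locale;

public class UppercaseTransform {

    // The transformation applied to every record read from the input topic.
    public static String apply(String value) {
        return value == null ? null : value.toUpperCase(Locale.ROOT);
    }

    public static void main(String[] args) {
        System.out.println(apply("hello flink")); // prints "HELLO FLINK"
    }
}
```

Keeping the record-level logic out of the Flink operator itself makes the first layer of testing trivial; the Kafka wiring can then be tested separately.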

The question is: how do I test this flow?

By "test" I mean unit/integration tests.

Thanks!

Upvotes: 3

Views: 5099

Answers (1)

Nizar

Reputation: 436

The Flink documentation has a section on how to write unit/integration tests for your transformation operators: link. It also has a short section on testing checkpointing and state handling, and on using AbstractStreamOperatorTestHarness.
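In the spirit of those docs, a stateless operator can be unit-tested by calling its map() method directly, with no cluster involved. In this self-contained sketch, MapFunction is a local stand-in with the same shape as Flink's org.apache.flink.api.common.functions.MapFunction, and the operator name is invented:

```java
// Plain unit test for a transformation operator: exercise map() directly.
// MapFunction below is a local stand-in mirroring Flink's interface so the
// sketch compiles without the Flink dependency.
public class MapOperatorTest {

    interface MapFunction<T, O> {
        O map(T value) throws Exception;
    }

    // The operator under test: the uppercase step of the pipeline.
    static class UppercaseMapper implements MapFunction<String, String> {
        @Override
        public String map(String value) {
            return value.toUpperCase();
        }
    }

    public static void main(String[] args) throws Exception {
        MapFunction<String, String> mapper = new UppercaseMapper();
        // Assert on the operator's output, exactly as a JUnit test would.
        if (!"FOO".equals(mapper.map("foo"))) {
            throw new AssertionError("expected FOO");
        }
        System.out.println("ok");
    }
}
```

With the real dependency on the classpath, the local interface disappears and the same test body works against the actual Flink MapFunction.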

However, I think you are more interested in end-to-end integration testing, including sources and sinks. For that, you can start a Flink mini cluster. Here is an example that starts a Flink mini cluster: link.

You can also launch a Kafka broker within a JVM and use it for your tests; Flink's Kafka connector does exactly that in its integration tests. Here is sample code that starts the Kafka server: link.
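Whatever the Kafka version, an embedded broker is bootstrapped from a small set of broker properties. As a rough sketch (the property names are standard Kafka broker settings, but the exact values and the bootstrap code depend on your Kafka version and test setup):

```
# Hypothetical minimal configuration for an embedded test broker
broker.id=0
listeners=PLAINTEXT://127.0.0.1:9092
zookeeper.connect=127.0.0.1:2181
log.dirs=/tmp/embedded-kafka-logs
offsets.topic.replication.factor=1
auto.create.topics.enable=true
```

Pointing log.dirs at a temporary directory and deleting it after each run keeps the tests isolated from one another.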

If you are running locally, you can use a simple generator app to produce messages for your source Kafka topic (many are available; they can generate messages continuously or at a configured interval). Here is an example of how to set Flink's job global parameters when running locally: Kafka010Example.
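Such a generator can be very small. This sketch (all names are made up) builds numbered test messages; the actual Kafka send is left as a comment because it needs the kafka-clients dependency:

```java
// Hypothetical stand-alone generator for the source topic: produces numbered
// messages, either a fixed count or continuously at an interval.
import java.util.ArrayList;
import java.util.List;

public class MessageGenerator {

    // Build n test messages; in a real app each one would be handed to a
    // KafkaProducer and sent to the configured source topic.
    public static List<String> generate(String prefix, int n) {
        List<String> out = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            out.add(prefix + "-" + i);
            // producer.send(new ProducerRecord<>(sourceTopic, out.get(i)));
        }
        return out;
    }

    public static void main(String[] args) throws InterruptedException {
        // Continuous mode: emit one message per interval (3 messages, 10 ms apart).
        for (String msg : generate("msg", 3)) {
            System.out.println(msg);
            Thread.sleep(10);
        }
    }
}
```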

Another alternative is to create an integration environment (as opposed to production) to run your end-to-end tests. You will get a real feel for how your program behaves in a production-like environment. It is always advisable to have a complete, parallel testing environment, including test source and sink Kafka topics.

Upvotes: 5
