Marco

Reputation: 560

Flink - Integration Testing Table API?

I have built a very small and straightforward Flink app which consumes events from Kafka (JSON), deserializes them into a Java object, creates two Tables, uses the Table API to perform some simple operations, and finally joins the two tables and writes the result back to a Kafka topic.

What are the best practices for testing such code? How do I go about writing an integration test that verifies that the code written with the Table API produces the right result?

(Using Flink 1.8.3)

Upvotes: 1

Views: 532

Answers (1)

Jark Wu

Reputation: 176

We added an integration test for the Kafka SQL connector in 1.10, KafkaTableITCase. It creates a Kafka table and writes some data into it (using the JSON format), then reads it back, applies a window aggregation, and finally checks the window results using a TestingSinkFunction. You can check the code here:

https://github.com/apache/flink/blob/release-1.10/flink-connectors/flink-connector-kafka-base/src/test/java/org/apache/flink/streaming/connectors/kafka/KafkaTableTestBase.java
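The same idea also works without a Kafka broker: replace the Kafka source with a bounded in-memory stream, run the Table API logic, and collect the results in a sink the test can assert on. Below is a minimal sketch of that pattern against Flink 1.8's Java Table API; the class name `TableApiIT`, the columns `key`/`amount`, and the aggregation are made-up placeholders for your own pipeline, and Flink must be on the classpath.

```java
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.SinkFunction;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

import java.util.ArrayList;
import java.util.List;

public class TableApiIT {

    // Collected in a static field so the test can inspect the results after
    // env.execute() returns (local execution runs the sink in the same JVM).
    static final List<Row> RESULTS = new ArrayList<>();

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1); // deterministic ordering for assertions
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // Stand-in for the Kafka source: a bounded stream of known test data.
        DataStream<Tuple2<String, Integer>> input =
                env.fromElements(Tuple2.of("a", 1), Tuple2.of("a", 2), Tuple2.of("b", 3));

        // The logic under test: a hypothetical grouped aggregation.
        Table table = tEnv.fromDataStream(input, "key, amount");
        Table result = table.groupBy("key").select("key, amount.sum as total");

        // The aggregation emits updates, so convert to a retract stream and
        // apply the add/retract messages to the result list.
        tEnv.toRetractStream(result, Row.class)
            .addSink(new SinkFunction<Tuple2<Boolean, Row>>() {
                @Override
                public void invoke(Tuple2<Boolean, Row> value, Context context) {
                    if (value.f0) {
                        RESULTS.add(value.f1);
                    } else {
                        RESULTS.remove(value.f1);
                    }
                }
            });

        env.execute("table-api-integration-test");
        System.out.println(RESULTS);
    }
}
```

After execution, `RESULTS` should hold one row per key with the summed amount, which a JUnit test can compare against the expected output. For an end-to-end test that also covers the JSON deserialization and the Kafka connector configuration, the KafkaTableITCase linked above is the better template.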

Upvotes: 1

Related Questions