Reputation: 560
I have built a very small and straightforward Flink app which consumes events from Kafka (JSON), deserializes them into a Java object, creates two Tables from the resulting stream, uses the Table API to do some simple operations, and finally joins the two tables and writes the result back to Kafka.
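For reference, the job looks roughly like this (Event, the field names, and the topic names are simplified placeholders for my actual code):

```java
import java.io.IOException;
import java.util.Properties;

import com.fasterxml.jackson.databind.ObjectMapper;

import org.apache.flink.api.common.serialization.AbstractDeserializationSchema;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

public class EnrichmentJob {

    // Placeholder POJO for the JSON events.
    public static class Event {
        public String type;
        public long id;
        public String payload;
    }

    // JSON -> POJO via Jackson; the mapper is transient because the schema is serialized.
    public static class EventDeserializationSchema extends AbstractDeserializationSchema<Event> {
        private transient ObjectMapper mapper;

        @Override
        public Event deserialize(byte[] message) throws IOException {
            if (mapper == null) {
                mapper = new ObjectMapper();
            }
            return mapper.readValue(message, Event.class);
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "enrichment-job");

        DataStream<Event> events = env.addSource(
                new FlinkKafkaConsumer<>("input-topic", new EventDeserializationSchema(), props));

        // Two tables from the same stream; fields are renamed so the join keys are unambiguous.
        tEnv.registerDataStream("A", events.filter(e -> "a".equals(e.type)), "id as aId, payload");
        tEnv.registerDataStream("B", events.filter(e -> "b".equals(e.type)), "id as bId, payload as info");

        // The Table API logic I want to cover with tests.
        Table joined = tEnv.scan("A")
                .join(tEnv.scan("B"), "aId = bId")
                .select("aId, payload, info");

        // Serialize each result row as a string and write it back to Kafka.
        tEnv.toAppendStream(joined, Row.class)
                .map(row -> row.toString())
                .addSink(new FlinkKafkaProducer<>("output-topic", new SimpleStringSchema(), props));

        env.execute("enrichment-job");
    }
}
```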
What are the best practices for testing such code? How do I go about writing integration tests that verify that the code written with the Table API produces the right results?
(Using Flink 1.8.3)
Upvotes: 1
Views: 532
Reputation: 176
We added an integration test for the Kafka SQL connector in 1.10: KafkaTableITCase. It creates a Kafka table, writes some data into it (using the JSON format), reads it back, applies a window aggregation, and finally checks the window results using a TestingSinkFunction. You can check the code here:
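If you are stuck on 1.8.3, the same pattern works without the 1.10 connector test harness: feed your Table API logic from in-memory streams and collect the output with a small test sink instead of Kafka, so the join can be asserted directly. A rough sketch (the table names, fields, and CollectSink here are made up for illustration, not taken from KafkaTableITCase):

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.SinkFunction;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;
import org.junit.Assert;
import org.junit.Test;

public class TableJoinITCase {

    // Static so the sink instances running in the local cluster share it.
    private static final List<String> RESULTS = new ArrayList<>();

    // Collects every emitted row as a string for later assertions.
    private static class CollectSink implements SinkFunction<Row> {
        @Override
        public void invoke(Row value, Context context) {
            RESULTS.add(value.toString());
        }
    }

    @Test
    public void joinProducesExpectedRows() throws Exception {
        RESULTS.clear();

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1);
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // Bounded in-memory streams stand in for the two Kafka-backed tables.
        DataStream<Tuple2<Integer, String>> orders =
                env.fromElements(Tuple2.of(1, "apple"), Tuple2.of(2, "pear"));
        DataStream<Tuple2<Integer, String>> customers =
                env.fromElements(Tuple2.of(1, "alice"), Tuple2.of(2, "bob"));

        tEnv.registerDataStream("Orders", orders, "oId, product");
        tEnv.registerDataStream("Customers", customers, "cId, name");

        // The logic under test: the same Table API operations as in production.
        Table result = tEnv.scan("Orders")
                .join(tEnv.scan("Customers"), "oId = cId")
                .select("name, product");

        tEnv.toAppendStream(result, Row.class).addSink(new CollectSink());

        env.execute();

        Assert.assertEquals(2, RESULTS.size());
        Assert.assertTrue(RESULTS.contains("alice,apple"));
        Assert.assertTrue(RESULTS.contains("bob,pear"));
    }
}
```

The point is to keep the Table API transformation in a method you can call from both the production job and the test, so only the sources and sinks differ between the two.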
Upvotes: 1