lxohi

Reputation: 350

Is it possible for Flink to run continuous query over non-dynamic tables?

I'm hoping to use Flink SQL as a materialized view over multiple different services. For example: read data from MySQL, Redis, and RPC services, join it together, then update a result table stored in PostgreSQL.

All of the services above write notifications to Kafka about what has changed.

If the "notification" contained all the information in the source tables, I could just use dynamic tables. However, the source tables are so large that it would be wasteful to store all of their data in dynamic tables.

So the best way to do this would be:

  1. Flink reads the notifications from the input Kafka topic.
  2. Instead of updating and querying dynamic tables held in state, Flink queries the external services for the data needed to re-compute the result of the SQL.
  3. Flink writes the result to a sink (like PostgreSQL).

Is it possible to do this with some tricks?

Upvotes: 1

Views: 555

Answers (1)

BenoitParis

Reputation: 3184

Seems like a job for `FOR SYSTEM_TIME AS OF` / `LookupTableSource`, which is available for JDBC, but not for Redis or RPC services. That should not be too difficult to implement yourself, though.
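A minimal sketch of such a lookup join, assuming a hypothetical `notifications` Kafka topic keyed by `user_id` and a MySQL-backed `users` table (all table names, fields, and connection settings here are illustrative, not from the question):

```sql
-- Change notifications arriving from Kafka (hypothetical schema)
CREATE TABLE notifications (
  user_id   BIGINT,
  proc_time AS PROCTIME()  -- processing-time attribute required for the lookup join
) WITH (
  'connector' = 'kafka',
  'topic' = 'notifications',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json'
);

-- Source table backed by MySQL via the JDBC connector; rows are fetched
-- on demand, not materialized in Flink state
CREATE TABLE users (
  id   BIGINT,
  name STRING
) WITH (
  'connector' = 'jdbc',
  'url' = 'jdbc:mysql://localhost:3306/mydb',
  'table-name' = 'users'
);

-- Each notification triggers a point lookup against MySQL instead of
-- joining against a fully materialized dynamic table
INSERT INTO result_table
SELECT n.user_id, u.name
FROM notifications AS n
JOIN users FOR SYSTEM_TIME AS OF n.proc_time AS u
  ON n.user_id = u.id;
```

The JDBC connector's lookup-cache options (e.g. `lookup.cache.max-rows`, `lookup.cache.ttl`) can bound how much of the external table is kept in memory; for Redis or an RPC service you would implement your own `LookupTableSource`, as noted above.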

Upvotes: 1
