Reputation: 33
I am currently working on a project to extend SQL with additional stream computing features based on Apache Flink.
After extensive research, I found that Calcite is a great tool to help me parse, validate, and optimize those SQL queries, but Calcite's streaming support is still immature, so I have to improve it to suit my needs.
Hence, I would like to know if there is a way to add custom clauses like
CREATE TABLE my_table (
  id bigint,
  user varchar(20)
) PARAMS (
  connector 'kafka',
  topic 'my_topic'
)
which uses PARAMS to define how to receive data from a Kafka connector and treat it as a dynamic table serving as a data source for Flink.
Since there is so little information about this, I would greatly appreciate it if any of you could provide some hints.
Thank you : )
Upvotes: 3
Views: 1196
Reputation: 18987
Until the latest release (1.15.0, 11th Dec. 2017), Apache Calcite did not support DDL statements such as CREATE TABLE or DROP TABLE. The reason was that
SELECT and DML are standardized, but DDL tends to be database-specific, so our policy is that you make DDL extensions outside of Calcite.
(see Calcite dev mailing list).
With Calcite 1.15.0, the community added basic support for DDL statements. The feature is implemented as an optional module and demonstrates how DDL statements can be customized (see the documentation). So it is still expected that systems using Calcite customize the parser and DDL syntax to their own needs.
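As a minimal sketch (assuming Calcite 1.15.0+ with the optional calcite-server module on the classpath), the DDL-enabled parser factory from that module can be plugged into a regular SqlParser like this; adding a custom clause such as PARAMS would still require extending or forking that module's grammar:

import org.apache.calcite.sql.SqlNode;
import org.apache.calcite.sql.ddl.SqlDdlParserImpl;
import org.apache.calcite.sql.parser.SqlParseException;
import org.apache.calcite.sql.parser.SqlParser;

public class DdlParseExample {
  public static void main(String[] args) throws SqlParseException {
    // A plain CREATE TABLE statement; the table and column names are illustrative.
    String ddl = "CREATE TABLE my_table (id BIGINT, name VARCHAR(20))";

    // Configure the parser to use the DDL-aware factory from calcite-server.
    SqlParser.Config config = SqlParser.configBuilder()
        .setParserFactory(SqlDdlParserImpl.FACTORY)
        .build();

    SqlParser parser = SqlParser.create(ddl, config);
    SqlNode node = parser.parseStmt(); // yields a CREATE TABLE AST node
    System.out.println(node);
  }
}

The design here is that Calcite keeps the core parser limited to standard SELECT/DML, and DDL lives in a separately generated parser (built from an extended grammar template), which is also the extension point a project like yours would use for custom syntax.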
Upvotes: 4