jomccr

Reputation: 1

How to ingest CDC events produced by Oracle CDC Source Connector into Snowflake

Our current pipeline follows a structure similar to the one outlined here, except we are pulling events from Oracle and pushing them to Snowflake. The flow goes something like this:

In the end I have a table with record_metadata and record_content fields that contain the raw Kafka messages.
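To make the payload concrete, here is a rough sketch of splitting one such message into an operation and a row image. This assumes a Debezium-style envelope with `op`/`before`/`after` fields; the actual field names in the Oracle CDC Source Connector's output may differ:

```python
import json

# Hypothetical CDC payload as landed in record_content (Debezium-style
# envelope; the real connector's schema may use different field names).
record_content = json.dumps({
    "op": "u",                      # c = create, u = update, d = delete
    "before": {"ID": 1, "NAME": "old"},
    "after":  {"ID": 1, "NAME": "new"},
})

def split_event(raw: str):
    """Return (operation, row image to apply) for one CDC message."""
    event = json.loads(raw)
    op = event["op"]
    # Deletes carry the row in "before"; inserts/updates carry it in "after".
    row = event["before"] if op == "d" else event["after"]
    return op, row

op, row = split_event(record_content)
# op == "u", row == {"ID": 1, "NAME": "new"}
```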

I'm having to build a set of procedures that handle the merge/upsert logic, operating on a stream on top of the raw table. The tables I'm trying to replicate in Snowflake are very wide, and there are around 100 of them, so writing the SQL MERGE statements by hand is infeasible.
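Since the statements are mechanical, one workaround is to generate them from a column list (e.g. pulled from INFORMATION_SCHEMA). A rough sketch, where the table/column names and the `RECORD_CONTENT:op` / `RECORD_CONTENT:after` paths are illustrative assumptions about the payload shape:

```python
def generate_merge(target: str, stream: str, columns: list, keys: list) -> str:
    """Build a Snowflake MERGE that upserts/deletes from a CDC stream.

    Assumes the stream exposes the row image as RECORD_CONTENT:after:<col>
    and an operation flag RECORD_CONTENT:op -- adjust to your payload.
    """
    on = " AND ".join(f"t.{k} = s.RECORD_CONTENT:after:{k}" for k in keys)
    sets = ", ".join(f"t.{c} = s.RECORD_CONTENT:after:{c}" for c in columns)
    cols = ", ".join(columns)
    vals = ", ".join(f"s.RECORD_CONTENT:after:{c}" for c in columns)
    return (
        f"MERGE INTO {target} t USING {stream} s ON {on} "
        f"WHEN MATCHED AND s.RECORD_CONTENT:op = 'd' THEN DELETE "
        f"WHEN MATCHED THEN UPDATE SET {sets} "
        f"WHEN NOT MATCHED AND s.RECORD_CONTENT:op != 'd' "
        f"THEN INSERT ({cols}) VALUES ({vals})"
    )

sql = generate_merge("CUSTOMERS", "CUSTOMERS_STREAM", ["ID", "NAME"], ["ID"])
```

Looping this over the column metadata for all ~100 tables would produce the full set of statements without hand-writing any of them.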

Is there a better way to ingest the Kafka topic containing all of the CDC events generated from the Oracle connector straight into Snowflake, handling auto-creation of nonexistent tables and auto-applying updates/deletes/etc. as events come across the stream?
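For the auto-creation part, the best I have come up with is inferring DDL from a sample row image. A minimal sketch, with an assumed (simplified) mapping from JSON types to Snowflake column types:

```python
# Simplified, assumed mapping from JSON value types to Snowflake types.
TYPE_MAP = {str: "VARCHAR", int: "NUMBER", float: "FLOAT", bool: "BOOLEAN"}

def infer_create_table(table: str, sample_row: dict) -> str:
    """Derive a CREATE TABLE statement from one sample row image."""
    cols = ", ".join(
        f"{name} {TYPE_MAP.get(type(value), 'VARIANT')}"
        for name, value in sample_row.items()
    )
    return f"CREATE TABLE IF NOT EXISTS {table} ({cols})"

row = {"ID": 1, "NAME": "acme", "ACTIVE": True}
ddl = infer_create_table("CUSTOMERS", row)
# CREATE TABLE IF NOT EXISTS CUSTOMERS (ID NUMBER, NAME VARCHAR, ACTIVE BOOLEAN)
```

This is fragile (one sample row can't distinguish, say, NUMBER from FLOAT when the value happens to be integral), which is why a built-in solution would be preferable.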

Upvotes: 0

Views: 539

Answers (1)

Karthi Keya

Reputation: 1

Confluent's Snowflake connector simply lands the raw JSON data; implementing a Snowflake stored procedure to create the tables and merge the changes is one possibility, and views are often built on top of the JSON structure to expose typed columns.
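As a sketch of the view-over-JSON idea, you can generate a view per table that casts paths out of the VARIANT column. The column names, types, and the `RECORD_CONTENT:after` path here are assumptions to adjust to your payload:

```python
def generate_flatten_view(view: str, raw_table: str, columns: dict) -> str:
    """Build a view projecting typed columns out of a VARIANT RECORD_CONTENT.

    `columns` maps column name -> Snowflake type (illustrative names/types).
    """
    select = ", ".join(
        f"RECORD_CONTENT:after:{name}::{sftype} AS {name}"
        for name, sftype in columns.items()
    )
    return f"CREATE OR REPLACE VIEW {view} AS SELECT {select} FROM {raw_table}"

view_sql = generate_flatten_view(
    "CUSTOMERS_V", "RAW_EVENTS", {"ID": "NUMBER", "NAME": "VARCHAR"}
)
```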

Upvotes: 0
