Reputation: 11
I have researched the use of GoldenGate, Confluent, JDBC, and the "poor man's CDC" approach with Flashback Query.
What is the most effective way to load Oracle data into Kafka?
Upvotes: 0
Views: 662
Reputation: 191733
Flashback Query and the Kafka Connect JDBC source are roughly equivalent options: you're periodically running a query and sending the results into Kafka, without necessarily knowing what changed in the database, just placing whatever the ResultSet contains into the topic.
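For reference, a minimal JDBC source connector config (standalone-worker .properties style) might look something like the sketch below. The connection URL, table, and column names are placeholders; the property keys are those of the Confluent JDBC source connector (table.whitelist in older versions).

    # Sketch of a Kafka Connect JDBC source polling an Oracle table
    # (connection details, table and column names are placeholders)
    name=oracle-jdbc-source
    connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
    connection.url=jdbc:oracle:thin:@//db-host:1521/ORCLPDB1
    connection.user=kafka_user
    connection.password=********
    table.whitelist=ORDERS
    # poll the table periodically, tracking new/changed rows by these columns
    mode=timestamp+incrementing
    timestamp.column.name=UPDATED_AT
    incrementing.column.name=ORDER_ID
    topic.prefix=oracle-
    poll.interval.ms=10000

Because this polls, it only sees the current state of each row at query time, which is why deleted rows and intermediate updates are missed.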
GoldenGate, Attunity, or the work being done in the Debezium project let you capture every event as it occurs, in near real time, including DELETEs, the intermediate states of UPDATEs, and even database schema modifications.
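As a rough sketch of the log-based side, registering a Debezium Oracle connector looks something like the following. Property names vary between Debezium versions, and the host, credentials, and table names here are placeholders; check the Debezium Oracle connector docs for your version.

    # Sketch of a Debezium Oracle (LogMiner-based) source connector
    # (all values are placeholders)
    name=oracle-debezium-source
    connector.class=io.debezium.connector.oracle.OracleConnector
    database.hostname=db-host
    database.port=1521
    database.user=c##dbzuser
    database.password=********
    database.dbname=ORCLCDB
    table.include.list=INVENTORY.ORDERS
    topic.prefix=oracle
    # Debezium keeps DDL/schema history in its own Kafka topic
    schema.history.internal.kafka.bootstrap.servers=kafka:9092
    schema.history.internal.kafka.topic=schema-changes.oracle

Here each committed INSERT, UPDATE, and DELETE arrives as its own change event read from the redo logs, rather than as a periodic snapshot of the table.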
CDC-based solutions also tax the database less, since they are not repeatedly opening a connection and starting up the query engine to poll.
StreamSets is another option.
Other questions: Oracle replication data using Apache Kafka
Upvotes: 2