Reputation: 27
I am trying to maintain a near-real-time replica (within ~5 minutes, ideally) of data from a source system (an Azure VM running SQL Server, read-only, about 100 tables) in an Azure Storage Account (Data Lake Storage Gen2, blob folders) to support various downstream data workloads.
I had considered using Azure Data Factory to carry out an initial batch load of the historical data (which takes ~40 minutes in ADF), followed by incremental updates to the sink whenever source tables change (updates or inserts).
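For the incremental step, a common ADF pattern is a high-watermark query per table. A minimal sketch of the query the copy activity could issue, assuming each source table carries a `LastModified` datetime column (a hypothetical name; CDC or Change Tracking avoids needing such a column):

```sql
-- Minimal sketch of a watermark-based incremental extract.
-- Assumes a LastModified column exists on each table (hypothetical).
DECLARE @last_watermark    datetime2 = '2024-01-01T00:00:00'; -- persisted from the previous run
DECLARE @current_watermark datetime2 = SYSUTCDATETIME();      -- captured at the start of this run

SELECT *
FROM dbo.MyTable                         -- placeholder table name
WHERE LastModified >  @last_watermark
  AND LastModified <= @current_watermark;
-- After a successful copy, persist @current_watermark for the next run.
```

Note this pattern only captures inserts and updates, not deletes, and requires a reliable modification timestamp on every table.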
The challenge: what are the best approaches to establish this synchronization between source and sink?
Upvotes: 0
Views: 155
Reputation: 89361
You could start with Change Data Capture (CDC) or Change Tracking on the source database, then run a scheduled SSIS job to write the changed rows into blob storage. Or you could use a log-based CDC tool like Debezium.
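For reference, both options are a one-time setup per database and table. A minimal sketch, with placeholder database, table, and key-column names:

```sql
-- Option 1: Change Data Capture (needs SQL Server Agent running for the capture job).
USE MyDb;                               -- placeholder database name
EXEC sys.sp_cdc_enable_db;
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'MyTable',        -- placeholder table name
    @role_name     = NULL;              -- NULL = no gating role

-- Option 2: Change Tracking (lighter weight; tracks net changes, not history).
ALTER DATABASE MyDb
    SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);
ALTER TABLE dbo.MyTable ENABLE CHANGE_TRACKING;

-- With Change Tracking, the extract job reads net changes since its last version:
DECLARE @last_sync_version bigint = 0;  -- persisted from the previous run
SELECT ct.SYS_CHANGE_OPERATION, t.*
FROM CHANGETABLE(CHANGES dbo.MyTable, @last_sync_version) AS ct
LEFT JOIN dbo.MyTable AS t
    ON t.Id = ct.Id;                    -- join on the table's primary key (placeholder)
```

With ~100 tables, you would script this per table and have the SSIS (or ADF) job persist the last-synced version per table between runs.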
Upvotes: 1