user2227105

Reputation: 11

How to replace Data Export Service with Synapse Link for Dataverse

We are heavily reliant on DES as it exists today and have been monitoring developments around Synapse Link over the past year, since the DES deprecation announcement. To date, we have not found a comprehensive guide on how to use Synapse Link and pipelines to exactly replicate the capabilities of DES – a service that produces a SQL Server database (serverless/dedicated pools are not sufficiently performant by comparison) with replication of the Dataverse entities configured for export, including the OptionsetMetadata entities, delayed by at most 15 minutes. We have considered and tested all of the approaches we could find through the following links, but NONE is a true replacement for DES:

https://powerapps.microsoft.com/en-us/blog/do-more-with-data-from-data-export-service-to-azure-synapse-link-for-dataverse/

https://learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-incremental-updates

https://learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-pipelines?tabs=synapse-analytics

https://learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-synapse#access-near-real-time-data-and-read-only-snapshot-data

Is Synapse Link for Dataverse even capable of replacing Data Export Service? Is there some secret sauce I'm missing here? The Microsoft docs seem to cover only limited use cases and preview functionality, rather than focusing on what would be a like-for-like replacement of the existing service.

Upvotes: 0

Views: 1217

Answers (2)

KalleK

Reputation: 1

Serverless SQL seems to be way too slow; query performance is like scanning text files... ;) Building another pipeline that pushes data to a dedicated SQL pool adds another complex layer and even more delay.

It's also very hard to limit access to the serverless SQL pool, as disabling public internet access seems to break the Dataverse link, and the "Allow Azure services and resources to access this workspace" option does not help.

I really expect Microsoft to invest more in this functionality. It looks like third parties are joining the party: https://projectum.com/power-hub/

Upvotes: 0

Iulian Apostol

Reputation: 11

I understand you so well, because we are currently facing the exact same problem.

We have successfully used Synapse Link to export Dataverse data to Data Lake Storage. We then tried, with considerable effort, to use data pipelines to sink all the data into an Azure SQL Database via various mapping data flows, but this all turned out to be expensive and overkill, since those tools are built for big data, not for a simple export service. As I am not a data engineer, it was quite difficult to find the right solution.

Based on our experience, I can tell you we found two possible solutions that replicate the former DES and gave us no headaches.

  1. Use Azure Data Factory: create a linked service for Dataverse, then a linked service for the SQL Database, and then create a simple Copy Activity in an ADF pipeline configured with both linked services; the SQL database will be updated frequently as new data emerges in Dataverse (see the sketch after this list). Please study the following links: https://learn.microsoft.com/en-us/azure/data-factory/connector-dynamics-crm-office-365?tabs=data-factory as well as https://learn.microsoft.com/en-us/azure/data-factory/tutorial-copy-data-dot-net (just study the Copy data activity feature, not the entire flow).
  2. Metadata-driven copy. We have not yet implemented this solution, but it remains our backup plan (https://learn.microsoft.com/en-us/azure/data-factory/copy-data-tool-metadata-driven). As I understand it, this can be used to copy data from a Data Lake to SQL with no major problems, but I have yet to try it.
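
To make option 1 more concrete, here is a minimal sketch of what that pipeline can look like when created through the Python management SDK (azure-mgmt-datafactory) instead of the portal. It assumes the two linked services and their datasets already exist in the factory; the resource group, factory, dataset, and pipeline names are placeholders I made up for illustration:

```python
# A minimal sketch of option 1, assuming the Dataverse and Azure SQL linked
# services and their datasets ("DataverseAccountDataset", "SqlAccountDataset"
# -- both hypothetical names) have already been created in the factory.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureSqlSink,
    CopyActivity,
    DatasetReference,
    DynamicsSource,
    PipelineResource,
)

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-dataverse-export"  # hypothetical resource group
FACTORY_NAME = "adf-dataverse-export"   # hypothetical data factory

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# One Copy Activity per exported entity (here: account). The source reads
# from the Dataverse dataset, the sink writes to the Azure SQL dataset.
copy_account = CopyActivity(
    name="CopyAccountToSql",
    inputs=[DatasetReference(type="DatasetReference",
                             reference_name="DataverseAccountDataset")],
    outputs=[DatasetReference(type="DatasetReference",
                              reference_name="SqlAccountDataset")],
    source=DynamicsSource(),  # a FetchXML query= can narrow the rows copied
    sink=AzureSqlSink(),
)

# Publish the pipeline and kick off a run.
adf.pipelines.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "DataverseToSqlPipeline",
    PipelineResource(activities=[copy_account]),
)
run = adf.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, "DataverseToSqlPipeline", parameters={},
)
print(f"Started pipeline run {run.run_id}")
```

In practice you would attach a schedule or tumbling-window trigger to the pipeline so the SQL copy is refreshed on your chosen interval; a 15-minute trigger roughly matches the replication delay DES used to offer.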

Hope this helps!

Upvotes: 1
