willy sepulveda

Reputation: 149

ADF Copy data from Azure Databricks Delta Lake to Azure SQL Server

I'm trying to use the Copy data activity to extract information from an Azure Databricks Delta Lake, but I've noticed that it doesn't pass the information directly from the Delta Lake to the SQL Server I need; instead, it has to stage it in an Azure Blob Storage account. When I run it, it throws the following error:

ErrorCode=AzureDatabricksCommandError, Hit an error when running the command in Azure Databricks. Error details: Failure to initialize configuration
Invalid configuration value detected for fs.azure.account.key
Caused by: Invalid configuration value detected for fs.azure.account.key

Searching for information, I found a possible solution, but it didn't work:

Invalid configuration value detected for fs.azure.account.key copy activity fails

Does anyone have any idea how the hell to pass information from an Azure Databricks Delta Lake table to a table in SQL Server?

These are some images of the structure that I have in ADF:

[screenshot: ADF pipeline structure]

In the image, I get a message telling me that I must have a Storage Account to continue.

These are images of the configuration and of the failed execution:

Conf: [screenshot: copy activity configuration]

Fail: [screenshot: failed execution error details]

Thank you very much

Upvotes: 0

Views: 5345

Answers (2)

willy sepulveda

Reputation: 149

The solution to this problem was the following:

Correct the way the Storage Access Key configuration was being defined.

The original setting:

spark.hadoop.fs.azure.account.key.<storageaccountname>.blob.core.windows.net

must be changed to use the dfs (Data Lake Storage Gen2) endpoint:

spark.hadoop.fs.azure.account.key.<storageaccountname>.dfs.core.windows.net
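If you want to sanity-check the key from a notebook before re-running the pipeline, here is a minimal sketch (the account name mystorageaccount, the secret scope adf-scope, and the table path are hypothetical placeholders; the ADF connector submits its own jobs, so the cluster-level setting above is still required):

# Hypothetical names: replace the account, scope, and path with your own.
spark.conf.set(
    "fs.azure.account.key.mystorageaccount.dfs.core.windows.net",
    dbutils.secrets.get(scope="adf-scope", key="storage-access-key"),
)

# If the key is accepted, reading a Delta path on the account should work.
df = spark.read.format("delta").load(
    "abfss://staging@mystorageaccount.dfs.core.windows.net/delta/my_table"
)
df.show(5)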

Upvotes: 1

Pratik Lad

Reputation: 8291

Does anyone have any idea how the hell to pass information from an Azure Databricks Delta Lake table to a table in SQL Server?

To achieve the above scenario, follow these steps:

First, go to your Databricks cluster, edit it, and under Advanced options >> Spark >> Spark config add the lines below if you are using Blob Storage:

spark.hadoop.fs.azure.account.key.<storageaccountname>.blob.core.windows.net <Accesskey>
spark.databricks.delta.optimizeWrite.enabled true 
spark.databricks.delta.autoCompact.enabled true 

[screenshot: cluster Spark config]
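Note that if your storage account is ADLS Gen2 (hierarchical namespace enabled), the key needs the dfs endpoint rather than blob, as in the other answer:

spark.hadoop.fs.azure.account.key.<storageaccountname>.dfs.core.windows.net <Accesskey>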

After that, since you are using Azure SQL Database as the sink, enable staging: set the same Blob Storage account's linked service as the staging account linked service, and give it a storage path from your Blob Storage (a sketch of the resulting settings follows the screenshot below).

[screenshot: copy activity staging settings]
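In the pipeline JSON, the staging settings end up looking roughly like this sketch (the linked service name AzureBlobStorageLS and the staging path are placeholders for your own):

"typeProperties": {
    "source": { "type": "AzureDatabricksDeltaLakeSource" },
    "sink": { "type": "AzureSqlSink" },
    "enableStaging": true,
    "stagingSettings": {
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLS",
            "type": "LinkedServiceReference"
        },
        "path": "staging"
    }
}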

And then debug it. Make sure you have completed the prerequisites from the official document.

My sample input:

[screenshot: sample Delta table data]

Output in SQL:

[screenshot: query results in the SQL database]

Upvotes: 1
