Reputation: 63
Data Factory Copy activity fails when copying a Delta table from Databricks to a storage account (ADLS Gen2)
Details
ErrorCode=AzureDatabricksCommandError,Hit an error when running the command in Azure Databricks. Error details: Failure to initialize configuration
Invalid configuration value detected for fs.azure.account.key
Caused by: Invalid configuration value detected for fs.azure.account.key.
Appreciate your help.
Upvotes: 1
Views: 11312
Reputation: 31
I think you can pass the secret as below:
spark.hadoop.fs.azure.account.key.<storage_account_name>.blob.core.windows.net {{secrets/<secret-scope-name>/<secret-name>}}
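If you want to try this without editing the cluster, here is a minimal sketch of the same idea at notebook level, assuming a secret scope named my-scope, a secret named storage-key, and a storage account mystorageacct (all placeholder names, adjust to your own):

# dbutils and spark are predefined in a Databricks notebook.
# Placeholder names: "my-scope", "storage-key", "mystorageacct".
account_key = dbutils.secrets.get(scope="my-scope", key="storage-key")

# Session-level equivalent of the cluster Spark config line above.
spark.conf.set(
    "fs.azure.account.key.mystorageacct.blob.core.windows.net",
    account_key,
)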
Upvotes: 1
Reputation: 63
Edited the cluster Spark config as below:
fs.azure.account.key.<storage_account_name>.dfs.core.windows.net {{secrets/<secret-scope-name>/<secret-name>}}
It's working fine now. Able to copy data from the Delta Lake table to ADLS Gen2.
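To double-check the fix, a rough sketch (with placeholder database, table, container, and account names) that reads the Delta table and writes it to the Gen2 account over abfss://, relying on the cluster config above:

# Run in a Databricks notebook; spark is predefined there.
# Placeholder names: "mydb.my_delta_table", "mycontainer", "mystorageacct".
df = spark.read.table("mydb.my_delta_table")

# abfss:// uses the fs.azure.account.key.<account>.dfs.core.windows.net setting.
df.write.format("delta").mode("overwrite").save(
    "abfss://mycontainer@mystorageacct.dfs.core.windows.net/exports/my_delta_table"
)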
Upvotes: -1
Reputation: 2729
The above error mainly happens because staging is not enabled. We need to enable staging to copy data from Delta Lake.
In Azure Databricks, go to the cluster -> Advanced options and edit the Spark config in the format below.
spark.hadoop.fs.azure.account.key.<storage_account_name>.blob.core.windows.net <Access Key>
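As a quick sanity check that the access key is picked up (placeholder container and account names), you can list the container root from a notebook before running the Copy activity:

# dbutils is predefined in a Databricks notebook.
# Placeholder names: "mycontainer", "mystorageacct".
for info in dbutils.fs.ls("wasbs://mycontainer@mystorageacct.blob.core.windows.net/"):
    print(info.path)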
After that, you can follow this official document; it has a detailed explanation of the Copy activity with Delta Lake.
You can also refer to this article by RishShah-4592.
Upvotes: 1