jKraut

Reputation: 2487

Connect to Blob storage "no credentials found for them in the configuration"

I'm working in a Databricks notebook backed by a Spark cluster and am having trouble connecting to Azure Blob Storage. I used this link and tried the section Access Azure Blob Storage Directly - Set up an account access key. This runs with no errors:

spark.conf.set(
  "fs.azure.account.key.<your-storage-account-name>.blob.core.windows.net",
  "<your-storage-account-access-key>")

But I get an error when I try to do an 'ls' on the directory:

dbutils.fs.ls("wasbs://<your-container-name>@<your-storage-account-name>.blob.core.windows.net/<your-directory-name>")

shaded.databricks.org.apache.hadoop.fs.azure.AzureException: shaded.databricks.org.apache.hadoop.fs.azure.AzureException: Unable to access container <container name> in account <storage account name>core.windows.net using anonymous credentials, and no credentials found for them in the configuration.

If there is a better way, please suggest that as well. Thanks.

Upvotes: 4

Views: 6189

Answers (1)

Ayush Bijawat

Reputation: 31

  1. You need to pass the storage account name and key while setting up the configuration. You can find these in the Azure portal.
spark.conf.set(
     "fs.azure.account.key.<your-storage-account-name>.blob.core.windows.net",
     "<your-storage-account-access-key>")
  2. When doing the ls, you also need to include the container name and the directory name in the wasbs path (both steps are combined in the sketch below).
dbutils.fs.ls("wasbs://<your-container-name>@<your-storage-account-name>.blob.core.windows.net/<your-directory-name>")

Hope this will resolve your issue!

Upvotes: 3
