Valkyrja.Kara

Reputation: 41

Databricks: Cannot access mounted blob

I mounted a blob storage container (confirmed blob type):

dbutils.fs.mount(
  source = "wasbs://[email protected]",
  mount_point = "/mnt/mymountpoint",
  extra_configs = {"fs.azure.sas.mycontainer.myblobstorageaccount.blob.core.windows.net":dbutils.secrets.get(scope = "mydatabricksscope", key = "nameofmysecretinkeyvault")})

The mount point is created and I can see it in Databricks. I placed a csv file in this container, as well as a folder containing further csv files, but I cannot access anything:

dbutils.fs.ls("/mnt/mymountpoint/")

java.io.FileNotFoundException: / is not found

dbutils.fs.ls("wasbs://[email protected]/")

shaded.databricks.org.apache.hadoop.fs.azure.AzureException: shaded.databricks.org.apache.hadoop.fs.azure.AzureException: Container ownbackupsalesforcedevemastercnt in account ownbackupsalesforcedev.blob.core.windows.net not found, and we can't create it using anoynomous credentials, and no credentials found for them in the configuration.

df = spark.read.format('csv').load('/mnt/mymountpoint/mycsv.csv', header="true")

AnalysisException: Path does not exist: dbfs:/mnt/ownbackupsalesforcemnt/accounts.csv

I've recreated the secret scope in Databricks and unmounted and remounted this several times, but I still cannot get in. Can anyone please help me?
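For reference, this is roughly how I unmount and remount each time (a minimal sketch using the same placeholder names as above):

# Unmount if the mount point already exists, then remount with the SAS from the secret scope
if any(m.mountPoint == "/mnt/mymountpoint" for m in dbutils.fs.mounts()):
    dbutils.fs.unmount("/mnt/mymountpoint")

dbutils.fs.mount(
  source = "wasbs://[email protected]",
  mount_point = "/mnt/mymountpoint",
  extra_configs = {"fs.azure.sas.mycontainer.myblobstorageaccount.blob.core.windows.net": dbutils.secrets.get(scope = "mydatabricksscope", key = "nameofmysecretinkeyvault")})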

Upvotes: 1

Views: 2130

Answers (1)

Rakesh Govindula

Reputation: 11294

First, try this without the secret scope.

Please follow the process below:

  • As you are trying to mount using a SAS (Shared Access Signature), go to the storage account and click on Shared access signature in the sidebar.
    Make sure you check Service, Container, and Object under Allowed resource types.


  • Now click on Generate SAS, copy the token, and paste it into your code:

    dbutils.fs.mount(
      source = "wasbs://[email protected]",
      mount_point = "/mnt/mymountpoint",
      extra_configs = {"fs.azure.sas.mycontainer.myblobstorageaccount.blob.core.windows.net": "<SAS token>"})

  • Make sure you remove the leading ? character from the SAS token before adding it to the code (see the sketch below).
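If you prefer to do that cleanup in code, here is a minimal sketch (the sas_token variable is hypothetical; container and account names are the placeholders from above):

# Strip a leading '?' from the copied SAS token before mounting
sas_token = "<SAS token copied from the portal>".lstrip("?")

dbutils.fs.mount(
  source = "wasbs://[email protected]",
  mount_point = "/mnt/mymountpoint",
  extra_configs = {"fs.azure.sas.mycontainer.myblobstorageaccount.blob.core.windows.net": sas_token})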

Please check these outputs as per my repro:

(repro output screenshots)

Alternatively, you can mount using the storage account access key.


Syntax:

dbutils.fs.mount(
  source = "wasbs://<containername>@<storageaccountname>.blob.core.windows.net",
  mount_point = "/mnt/<mountname>",
  extra_configs = {"fs.azure.account.key.<storageaccountname>.blob.core.windows.net": "<account key>"}
)
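Once the mount succeeds, the read from the question should work as-is. For example (placeholder mount name from the syntax above):

# List the mount and read the csv to verify
display(dbutils.fs.ls("/mnt/<mountname>"))

df = spark.read.format("csv").load("/mnt/<mountname>/mycsv.csv", header="true")
display(df)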

If you still face the same issue after switching back to the secret scope, the problem may be with the secret scope itself.
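You can quickly check the scope from a notebook (scope and key names are the placeholders from the question). The secret value is redacted in notebook output, but a wrong scope or key name raises an error here:

# Check that the scope exists and the key resolves
dbutils.secrets.list("mydatabricksscope")
dbutils.secrets.get(scope = "mydatabricksscope", key = "nameofmysecretinkeyvault")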

Please refer to the Microsoft documentation below to learn about secret scopes.

Reference:

https://learn.microsoft.com/en-us/azure/databricks/data/data-sources/azure/azure-storage#mount-azure-blob

Upvotes: 1
