Greencolor

Reputation: 695

Read the file from datalake in databricks -No such file directory error

I have a problem with reading the file from ADLS gen 2.

I have done the mounting properly, as after executing dbutils.fs.ls('/mnt/bronze') I can see the file path.

The way I did the mounting:

dbutils.fs.mount(
  source = "abfss://"+container_read+"@"+storage_account_name_read+".dfs.core.windows.net/",
  mount_point = "/mnt/dataverse",
  extra_configs = configs)

The way I read the file:

import json

with open('/mnt/dataverse/model.json', 'r') as f:
    data = f.read()
    manifest = json.loads(data)

It throws the error: No such file or directory.

The sketchy part is that I can read the file using a different cluster (runtime 11.3.x-scala2.12), but after switching to a 12.2 cluster I can't read it.

Any idea how I can fix this?

Upvotes: 1

Views: 1666

Answers (1)

Rakesh Govindula

Reputation: 11514

I created a mount point and got the same error with your code.


The reason might be that with open() cannot identify the mount point path directly. Python's built-in open() works on the local file system, so it needs the File API (/dbfs) path. To resolve the error, give the path like this:

/dbfs/mnt/<mount-point>/<filename>
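As a minimal sketch of this path translation (the helper name `to_file_api_path` is my own, not a Databricks API; the `/mnt/dataverse/model.json` path is taken from the question):

```python
import json

def to_file_api_path(mount_path: str) -> str:
    """Convert a DBFS mount path to the local File API path that
    Python's built-in open() understands on a Databricks driver."""
    # DBFS paths are exposed to local processes under the /dbfs prefix.
    if mount_path.startswith("/dbfs"):
        return mount_path
    return "/dbfs" + mount_path

path = to_file_api_path("/mnt/dataverse/model.json")
# path is now "/dbfs/mnt/dataverse/model.json"

# On a Databricks cluster you could then read the file as before:
# with open(path, "r") as f:
#     manifest = json.load(f)
```

This only changes the path prefix; the rest of the reading code from the question stays the same.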

To understand this path better, go to the DBFS browser and click on the file name under the mount point. You can see the file path in File API format.


With the above file path, I am able to read the file.


If the issue still persists, it's better to raise a support ticket on this.

Upvotes: 1
