Arun K

Reputation: 11

Access existing Delta Lake table data from another Azure Databricks workspace

I am working with Azure Databricks and storing data in Delta Lake tables. For some reason, I want to create a new Databricks workspace in the same network and access the existing Delta Lake tables from this newly created workspace.

I tried mounting the same storage path, and when running a CREATE TABLE IF NOT EXISTS query I get the error below:

Error in SQL Statement: Analysis Exception: Cannot create table The associated location ('dbfs:/mnt/[REDACTED]/processed/leads') is not empty but it's not a Delta table

The location contains Delta tables and files, but it is not working from the new workspace.

When I try SELECT * FROM the table, it shows:

Error in SQL statement: AnalysisException: Table or view not found: leads_enrichment.leads;
'Project [*]

Most answers suggest dropping the tables or using CREATE OR REPLACE, but I cannot lose the data in the Delta lake.

Can anyone suggest a way to access the existing Delta tables and their data from the newly created Databricks workspace?

Upvotes: 1

Views: 2146

Answers (1)

Rakesh Govindula

Reputation: 11514

When selecting from a Delta table whose location is under a storage mount path, use the following syntax:

%sql
select * from delta.`dbfs:/mnt/bricks3data/table1`;


I tried mounting to the same storage path and while running the query

So, you stored your Delta tables from one Databricks workspace in a storage account, and you are trying to access those Delta tables from another workspace via a mount path.

These are my Delta tables, stored in a storage account from the first workspace.


I have mounted this storage in the other workspace; my mount point here is bricks3data. You can use the code below to recreate the Delta tables in your new workspace.

%python
from delta.tables import *

for i in dbutils.fs.ls("/mnt/bricks3data/"):
    table_path = i.path
    if DeltaTable.isDeltaTable(spark, table_path):
        # Directory paths returned by dbutils.fs.ls end with '/', so [-2] is the folder name
        table_name = table_path.split("/")[-2]
        print("Table name:", table_name)
        # Create a Delta table with the same name from the mount path
        # (note: no space between `delta.` and the backtick-quoted path)
        spark.sql(
            "create table if not exists {name} as select * from delta.`{path}`".format(
                name=table_name, path=table_path
            )
        )
        print(table_name, "saved as a delta table from storage")
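
Note that CREATE TABLE ... AS SELECT copies the data into new managed tables. If you would rather reference the existing files in place, one alternative sketch is to register each path as an external table (the database, table name, and path below are examples from this answer, not fixed names; adjust them to your mount):

```sql
%sql
-- Register the existing Delta folder as an external table; no data is copied,
-- and the table reads directly from the mounted storage location.
CREATE TABLE IF NOT EXISTS leads_enrichment.leads
USING DELTA
LOCATION 'dbfs:/mnt/bricks3data/table1';
```

This also avoids the "location is not empty" error, because the statement points the metastore at the existing Delta location instead of trying to create a new table there.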


Tables stored:


Upvotes: 1
