Reputation: 27
I want to access the Delta tables of one Databricks environment from another Databricks environment by creating a global Hive metastore in one of the Databricks workspaces. Please let me know if this is possible or not.
Thanks in advance.
Upvotes: 2
Views: 8143
Reputation: 87174
There are two aspects here:
dataframe.write.format("delta").save("some_path_on_adls")
, you can read these data from another workspace that has access to that shared workspace - this could be done eitherspark.read.format("delta").load("some_path_on_adls")
delta.`some_path_on_adls`
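The real prerequisite for this is that both workspaces can reach the same storage account. As a minimal sketch (assuming ADLS Gen2 accessed with a service principal; the storage account name "storageacct", the secret scope "my-scope" and the secret key names are placeholders), the access configuration set in each workspace could look like this:
# Sketch: give this workspace OAuth access to the shared ADLS Gen2 account.
# "storageacct", "my-scope" and the secret key names below are placeholders.
client_id = dbutils.secrets.get(scope="my-scope", key="sp-client-id")
client_secret = dbutils.secrets.get(scope="my-scope", key="sp-client-secret")
tenant_id = dbutils.secrets.get(scope="my-scope", key="sp-tenant-id")

spark.conf.set("fs.azure.account.auth.type.storageacct.dfs.core.windows.net", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.storageacct.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.storageacct.dfs.core.windows.net", client_id)
spark.conf.set("fs.azure.account.oauth2.client.secret.storageacct.dfs.core.windows.net", client_secret)
spark.conf.set("fs.azure.account.oauth2.client.endpoint.storageacct.dfs.core.windows.net",
               "https://login.microsoftonline.com/" + tenant_id + "/oauth2/token")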
dataframe.write.format("delta").option("path", "some_path_on_adls")\
.saveAsTable("db_name.table_name")
and in another workspace execute following SQL (either via %sql
in notebook or via spark.sql
function:
CREATE TABLE db_name.table_name USING DELTA LOCATION 'some_path_on_adls'
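Once that one-time registration is done, the table can be used in the second workspace like any locally created table. A small sketch, using the same placeholder names as above:
# Sketch: reading the data from the second workspace after registration.
df = spark.table("db_name.table_name")                              # by metastore name
df_by_path = spark.read.format("delta").load("some_path_on_adls")   # or directly by path
df.show()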
dataframe.write.format("delta").option("path", "some_path_on_adls")\
.saveAsTable("db_name.table_name")
you still need to save it into shared location, so the data is accessible from another workspace, but you don't need to register the table explicitly, as another workspace will read the metadata from the same database.
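For completeness, such a shared ("global") metastore is normally set up as an external Hive metastore that every cluster in both workspaces points to via its Spark config. A rough sketch of that config (the JDBC URL, credentials and metastore version below are placeholders for your own metastore database, using the standard external-metastore configuration keys):
# Sketch of a cluster Spark config pointing at a shared external Hive metastore.
# All connection details are placeholders.
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:sqlserver://<server>.database.windows.net:1433;database=<metastore-db>
spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.hadoop.javax.jdo.option.ConnectionUserName <metastore-user>
spark.hadoop.javax.jdo.option.ConnectionPassword <metastore-password>
spark.sql.hive.metastore.version 3.1.0
spark.sql.hive.metastore.jars maven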
Upvotes: 2