AmitG

Reputation: 609

Databricks SQL workspace configuration with an external metastore

I have an Azure Databricks workspace set up with an external Hive metastore (backed by Azure SQL), and the metastore's JDBC connection URL is configured in the cluster's advanced settings. With this in place I can see and query the Delta Lake tables (stored on an Azure storage account, ADLS) in the Data section of Databricks.
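For context, the cluster-level configuration referred to here typically looks like the following minimal sketch; the metastore version and all placeholder values are illustrative assumptions, not my actual settings:

# Illustrative external Hive metastore settings (cluster advanced options);
# the version and all <...> placeholders are assumptions.
spark.sql.hive.metastore.version 2.3.7
spark.sql.hive.metastore.jars maven
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:sqlserver://<sqlServer>.database.windows.net:1433;database=<metastoreDb>
spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.hadoop.javax.jdo.option.ConnectionUserName <sqlUser>
spark.hadoop.javax.jdo.option.ConnectionPassword <sqlPassword>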

Now I want my users to access these tables through Databricks's SQL workspace. I have configured 'Data access' with a service principal in the SQL warehouse settings. Per the Databricks SQL documentation, I should then see the same Delta Lake tables that are visible in the 'Data Science and Engineering' section, but no schemas or tables from the metastore show up.

The problem: the tables are not visible under 'SQL workspace' > Data, and I am puzzled how the SQL warehouse would even know where my external metastore is and what the schema definitions are. I assume the SQL workspace must be configured somewhere with the Hive metastore connection, but the Databricks documentation is not clear on this point. Please suggest.

Below are the 'Data access' details for the service principal (<storageAccount> is the ADLS storage account name):

spark.hadoop.fs.azure.account.auth.type.<storageAccount>.dfs.core.windows.net OAuth
spark.hadoop.fs.azure.account.oauth.provider.type.<storageAccount>.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
spark.hadoop.fs.azure.account.oauth2.client.id.<storageAccount>.dfs.core.windows.net <CLIENT_ID>
spark.hadoop.fs.azure.account.oauth2.client.secret.<storageAccount>.dfs.core.windows.net <CLIENT_SECRET>
spark.hadoop.fs.azure.account.oauth2.client.endpoint.<storageAccount>.dfs.core.windows.net https://login.microsoftonline.com/<TENANT_ID>/oauth2/token

Upvotes: 1

Views: 1037

Answers (1)

Alex Ott

Reputation: 87259

As mentioned in the data access documentation, you need to add the same Spark configuration properties for the external Hive metastore to the SQL warehouse's data access configuration as you would set on "normal" Spark clusters - see the external Hive metastore documentation for the full list of properties.
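For illustration, a minimal sketch of what would be appended to the warehouse's 'Data access configuration' alongside the storage credentials shown in the question, assuming the same metastore settings as the cluster sketch above; the secret scope and key names are hypothetical:

# Same external metastore properties, added to the SQL warehouse data access
# configuration; the {{secrets/...}} references and <...> placeholders are assumptions.
spark.sql.hive.metastore.version 2.3.7
spark.sql.hive.metastore.jars maven
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:sqlserver://<sqlServer>.database.windows.net:1433;database=<metastoreDb>
spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.hadoop.javax.jdo.option.ConnectionUserName {{secrets/<scope>/<metastore-user-key>}}
spark.hadoop.javax.jdo.option.ConnectionPassword {{secrets/<scope>/<metastore-password-key>}}

With these properties in place, the warehouse can reach the metastore to list schemas and tables, while the OAuth properties from the question let it read the underlying Delta files on ADLS.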

Upvotes: 0
