Rufus

Reputation: 378

Install wheel package on Unity Catalog shared cluster from ADLS

I'm trying to install a .whl library stored in Azure Data Lake Storage (ADLS) on Databricks clusters using the Databricks Terraform provider. I can't mount the container first, because the clusters run in shared access mode for Unity Catalog.

The ADLS connection details are properly configured in the spark_conf section of the cluster configuration.

How can I configure the databricks_library resource to install the custom library correctly? Is this method supported by the Databricks Terraform provider?

Documentation I used: https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/library

I tried the following configuration (it is written for DBFS, but the provider's documentation has no example of using unmounted ADLS):

resource "databricks_library" "databricks_adls_lib" {
  cluster_id = module.<cluster_resource>.cluster_id
  whl        = "abfss://<container>@<ADLS_name>.dfs.core.windows.net/<wheel_library>.whl"
}

The error I receive suggests that Terraform is trying to mount the abfss path first:

Error: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.refreshMounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore

Installing the library manually in the Databricks workspace works without any issues.

Upvotes: 0

Views: 238

Answers (1)

Rufus

Reputation: 378

The method works properly. I hadn't cleared the tfstate from previous incorrect attempts, and that stale state is what caused the errors.
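For reference, a minimal sketch of the setup that ends up working, under the assumptions from the question: the cluster module name, storage account, container, wheel name, and service-principal credentials are all placeholders, and the exact spark_conf keys shown are the standard Hadoop ABFS OAuth settings, not something confirmed by this answer.

```hcl
# Hypothetical sketch: ADLS access is configured through spark_conf on the
# shared-mode cluster (no DBFS mount), and the wheel is referenced directly
# by its abfss:// URI.
resource "databricks_cluster" "shared" {
  cluster_name       = "uc-shared"
  spark_version      = "13.3.x-scala2.12"
  node_type_id       = "Standard_DS3_v2"
  num_workers        = 1
  data_security_mode = "USER_ISOLATION" # shared access mode for Unity Catalog

  spark_conf = {
    # Assumed service-principal (OAuth) credentials for the storage account
    "fs.azure.account.auth.type.<ADLS_name>.dfs.core.windows.net"           = "OAuth"
    "fs.azure.account.oauth.provider.type.<ADLS_name>.dfs.core.windows.net" = "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider"
    "fs.azure.account.oauth2.client.id.<ADLS_name>.dfs.core.windows.net"    = var.client_id
    "fs.azure.account.oauth2.client.secret.<ADLS_name>.dfs.core.windows.net" = var.client_secret
    "fs.azure.account.oauth2.client.endpoint.<ADLS_name>.dfs.core.windows.net" = "https://login.microsoftonline.com/${var.tenant_id}/oauth2/token"
  }
}

# The library resource from the question, pointed at the cluster above.
resource "databricks_library" "databricks_adls_lib" {
  cluster_id = databricks_cluster.shared.id
  whl        = "abfss://<container>@<ADLS_name>.dfs.core.windows.net/<wheel_library>.whl"
}
```

If the cluster was previously applied with a broken library configuration, running `terraform state rm databricks_library.databricks_adls_lib` (or a targeted destroy) before re-applying clears the stale state described above.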

Upvotes: 0
