nam

Reputation: 23749

Apache Spark Connector - where to install on Databricks

The "Apache Spark connector: SQL Server & Azure SQL" article from the Azure team describes how to use this connector.

Question: If you want to use the above connector in Azure Databricks, where will you install it?

Remarks: The above article tells you to install the connector from here and import it in, say, your notebook using com.microsoft.azure:spark-mssql-connector_2.12:1.2.0, but it does not say where to install it. I'm probably not understanding the article correctly. I need to use the connector in Azure Databricks and would like to know where to install the connector JAR (compiled) file.

Upvotes: 1

Views: 1394

Answers (2)

Aku

Reputation: 802

See this documentation to install via the UI:

https://learn.microsoft.com/en-us/azure/databricks/libraries/cluster-libraries#cluster

or follow these steps:

1. Click Compute in the sidebar.
2. Click a cluster name.
3. Click the Libraries tab.
4. Click Install New. The Install library dialog displays.
5. Select one of the Library Source options, complete the instructions that appear, and then click Install.

Once the library is installed on the cluster, you can use the connector from any notebook attached to it, as sketched below.
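As a minimal sketch, here is what reading from Azure SQL looks like after the install. The server, database, table, and credentials are placeholders to replace with your own, and spark is the SparkSession that Databricks notebooks provide automatically:

    # Placeholder connection details -- substitute your own.
    server_name = "jdbc:sqlserver://<your-server>.database.windows.net"
    database_name = "<your-database>"
    url = f"{server_name};databaseName={database_name}"

    # "com.microsoft.sqlserver.jdbc.spark" is the data source name the
    # connector registers; `spark` is the ambient SparkSession in a
    # Databricks notebook.
    df = (spark.read
          .format("com.microsoft.sqlserver.jdbc.spark")
          .option("url", url)
          .option("dbtable", "dbo.YourTable")
          .option("user", "<username>")
          .option("password", "<password>")
          .load())

    df.show()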

Upvotes: -1

Peter Dowdy

Reputation: 439

You can do this in the cluster setup. See this documentation: https://databricks.com/blog/2015/07/28/using-3rd-party-libraries-in-databricks-apache-spark-packages-and-maven-libraries.html

In short, when setting up the cluster, you can add third-party libraries by their Maven coordinates; "com.microsoft.azure:spark-mssql-connector_2.12:1.2.0" is an example of a Maven coordinate (groupId:artifactId:version).
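If you prefer to script the installation rather than use the cluster UI, the same Maven coordinate can be sent to the Databricks Libraries REST API (POST /api/2.0/libraries/install). A sketch in Python, with the workspace URL, access token, and cluster ID as placeholders:

    import requests

    # Placeholders -- replace with your workspace URL, a personal
    # access token, and the target cluster's ID.
    workspace_url = "https://<your-workspace>.azuredatabricks.net"
    token = "<personal-access-token>"

    payload = {
        "cluster_id": "<cluster-id>",
        "libraries": [
            {"maven": {"coordinates":
                "com.microsoft.azure:spark-mssql-connector_2.12:1.2.0"}}
        ],
    }

    # Install the library on the running cluster; the API returns an
    # empty JSON object on success.
    resp = requests.post(
        f"{workspace_url}/api/2.0/libraries/install",
        headers={"Authorization": f"Bearer {token}"},
        json=payload,
    )
    resp.raise_for_status()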

Upvotes: 3
