Stark

Reputation: 634

Accessing SQL Server from the Azure Databricks Notebook

I am using an Azure Databricks notebook to access a SQL Server database on the Azure platform.
I am writing a Spark DataFrame to a table 'TableName'.
Below is the sample code to connect and save the data into the SQL Server DB from the Azure Databricks notebook:

import java.util.Properties

val jdbc_url = "<sql-db-connection-string>"
val user = "<sql-db-username>"
val password = "<sql-db-password>"

val connectionProperties = new Properties()
connectionProperties.put("user", user)
connectionProperties.put("password", password)

// Append the DataFrame to the target table over JDBC
MyDataFrame.coalesce(1).write.mode("append").jdbc(jdbc_url, "SchemaName.TableName", connectionProperties)

Now the problem: although I am able to insert the data into the table, I do not know how it works internally.
Below is the document published by Databricks on accessing SQL Server: establish-connectivity-to-sql-server. But nowhere does it mention:
1) How does it establish the connection?
2) How does it close the connection?
3) If it does not close automatically, what code should I write to close the connection?

Please note: I am using Scala with the Spark framework.

Upvotes: 0

Views: 781

Answers (1)

madhu

Reputation: 1170

1) The connection is established through JDBC. JDBC drivers for Microsoft SQL Server and Azure SQL Database are available within Databricks. You can check this with:

Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver")
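
If the driver is available, this call returns its Class object; if it is not on the classpath, it throws a ClassNotFoundException.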

2) The input/output format itself takes care of closing the connections. Here jdbc is the format, and it closes the connection once the write completes, so you do not need to close anything yourself. For more information at the code level, you can look into the Spark source code.
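
For intuition, here is a rough sketch (not Spark's actual code) of the pattern Spark's JDBC writer follows for each partition: open a connection, write the rows in batches, and close the statement and connection in finally blocks. The jdbc_url, user, and password values are assumed to be defined as in the question; the two-column table layout is hypothetical.

import java.sql.DriverManager

// Simplified illustration of the per-partition write pattern:
// one connection per partition, closed in a finally block.
def writePartition(rows: Iterator[(Int, String)]): Unit = {
  val conn = DriverManager.getConnection(jdbc_url, user, password)
  try {
    conn.setAutoCommit(false)
    // Hypothetical target table with two columns (id, name).
    val stmt = conn.prepareStatement(
      "INSERT INTO SchemaName.TableName (id, name) VALUES (?, ?)")
    try {
      rows.foreach { case (id, name) =>
        stmt.setInt(1, id)
        stmt.setString(2, name)
        stmt.addBatch()
      }
      stmt.executeBatch()
      conn.commit()
    } finally {
      stmt.close()  // statement is closed whether or not the batch succeeds
    }
  } finally {
    conn.close()    // connection is always closed, which is why no user code is needed
  }
}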

Upvotes: 0
