jgtrz

Reputation: 375

Databricks SQL Server connection across multiple notebooks

I found some resources on how to pass variables across pySpark Databricks notebooks. I'm curious whether we can pass a SQL Server connection as well, e.g. define host/database/port/user/password in Notebook A and use that connection in Notebook B.

Upvotes: 0

Views: 573

Answers (1)

Rayan Ral

Reputation: 1859

Take a look at this part of the Databricks documentation: https://docs.databricks.com/notebooks/notebook-workflows.html#pass-structured-data. That lets you pass one or more strings across notebooks, but you'll still have to create the connection in Notebook B yourself.
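A minimal sketch of that approach, assuming hypothetical connection parameters (all names and values here are illustrative): Notebook A serializes the parameters to a JSON string and returns it via `dbutils.notebook.exit`, and Notebook B parses it and rebuilds the JDBC URL. Since `dbutils` only exists on Databricks, those calls are shown as comments and the JSON round-trip is simulated inline:

```python
import json

# Hypothetical connection parameters -- use a secret scope for the
# password in practice rather than a literal.
conn = {
    "host": "myserver.database.windows.net",
    "port": 1433,
    "database": "mydb",
    "user": "etl_user",
    "password": "example-password",
}

# In Notebook A, you would return the parameters as a JSON string:
#   dbutils.notebook.exit(json.dumps(conn))
# In Notebook B, you would run Notebook A and parse its result:
#   result = dbutils.notebook.run("path/to/notebookA", 60)
#   conn_b = json.loads(result)
payload = json.dumps(conn)    # what Notebook A would emit
conn_b = json.loads(payload)  # what Notebook B would receive

# Recreate the connection in Notebook B from the parsed parameters.
jdbc_url = (
    f"jdbc:sqlserver://{conn_b['host']}:{conn_b['port']};"
    f"databaseName={conn_b['database']}"
)
```

Note that `dbutils.notebook.exit` only accepts a string, which is why the parameters are packed into JSON first.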

Another option: create Notebook A so that it sets up the connection variable, and "run" it before executing your code in Notebook B (more details here: https://forums.databricks.com/questions/154/can-i-run-one-notebook-from-another-notebook.html). Basically, you need a cell with the code:

%run path/to/notebookA
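With the %run approach, everything Notebook A defines lands in Notebook B's scope, so the connection details only need to be written once. A sketch of what Notebook A might contain (the path, variable names, and values are assumptions, not anything from the question):

```python
# Contents of the hypothetical Notebook A.
# After `%run path/to/notebookA`, these names are available in Notebook B.

sql_host = "myserver.database.windows.net"  # illustrative values only
sql_port = 1433
sql_database = "mydb"

jdbc_url = f"jdbc:sqlserver://{sql_host}:{sql_port};databaseName={sql_database}"

# Notebook B could then use the shared variable directly, e.g.:
#   df = (spark.read.format("jdbc")
#         .option("url", jdbc_url)
#         .option("dbtable", "dbo.my_table")
#         .load())
```

Unlike `dbutils.notebook.run`, `%run` shares the whole namespace, so you can pass non-string objects this way too.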

Upvotes: 2
