Ricardo Francisco

Reputation: 11

Is it possible to install a Databricks notebook into a cluster similarly to a library?

I want the outputs/functions/definitions of a notebook to be available to other notebooks on the same cluster without having to run the original notebook over and over...


For instance, I want to avoid:

definitions_file: contains multiple commands, functions, etc.

notebook_1

#invoking definitions file
%run ../../0_utilities/definitions_file

notebook_2

#invoking definitions file
%run ../../0_utilities/definitions_file

.....

Therefore I want definitions_file to be available to all other notebooks running on the same cluster.

I am using Azure Databricks.

Thank you!

Upvotes: 1

Views: 266

Answers (1)

Alex Ott

Reputation: 87174

No, there is no such thing as a "shared notebook" that is implicitly imported. The closest you can get is to package your code as a Python library or put it into a Python file inside Repos, but you will still need to write from my_cool_package import * in every notebook (a sketch of that approach follows below).
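For illustration, a minimal sketch of the Repos approach. The file path, module name, and helper function here are assumptions made up for the example, not something from the question: keep the shared definitions in a plain .py file inside the Repo and import it from each notebook.

0_utilities/definitions_file.py (a plain Python file in the Repo, not a notebook)

# Shared constants and helpers used across notebooks (contents are illustrative).
DATE_FORMAT = "yyyy-MM-dd"

def normalize_columns(df):
    # Lower-case and underscore the column names of a Spark DataFrame.
    return df.toDF(*[c.lower().replace(" ", "_") for c in df.columns])

notebook_1 (any notebook attached to the cluster)

import sys
# Databricks normally puts the Repo root on sys.path for notebooks inside a Repo;
# appending the utilities folder explicitly is only a fallback, and this path is illustrative.
sys.path.append("/Workspace/Repos/<user>/<repo>/0_utilities")

from definitions_file import *   # still needed once per notebook

df_clean = normalize_columns(df)  # df is assumed to already exist in the notebook

The import line cannot be avoided, but the actual logic lives in one place, so changes to definitions_file.py take effect in every notebook without re-running a shared notebook.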

Upvotes: 1
