Nandhakumar Rajendran

Reputation: 441

Importing ipynb file from another ipynb notebook in azure databricks

I am trying to import an ipynb notebook from another notebook in Azure Databricks using

from ipynb.fs.full.test_1 import *

While importing I get the following KeyError:

KeyError: 'package'

Here is my test code

class Test1:
    def t1():
        a = 10
        b = 10
        c = a + b
        return c

Test1.t1()

Am I missing something?

Upvotes: 1

Views: 637

Answers (1)

Alex Ott

Reputation: 87279

Notebooks in Databricks aren't real files - they are more like entries in a database, not files stored on the file system. Because of this you can't use Python's import to pull code from one notebook into another.

Right now it's possible to use %run to include the content of one notebook in another (see docs), for example to implement testing of notebooks. Just split your code into two pieces:

  1. A notebook with the functions that you want to test (name it functions, for example):

def func1(....):
 ....

  2. In the notebook with the test code, put the following as a separate cell:

%run ./functions

This will include the whole content of the first notebook into the context of the second notebook.
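To make the split concrete, here is a minimal sketch, assuming a notebook named functions (as above) and a second, hypothetical notebook named test_functions; the add function is only illustrative:

# Notebook "functions" (a single cell):
def add(a, b):
    """Return the sum of the two arguments."""
    return a + b

# Notebook "test_functions", cell 1 - %run has to be the only content of its cell
# and it takes a path relative to the current notebook:
# %run ./functions

# Notebook "test_functions", cell 2 - after the %run cell executes, the names
# defined in "functions" are available in this notebook's context:
assert add(10, 10) == 20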

I have a demo project that shows how to use this approach to test notebooks on Databricks.

P.S. There is a workaround: download the notebooks onto the local file system, add them to sys.path, and so on, but it's cumbersome - you can find an example in the following answer.
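For reference, here is a rough sketch of that workaround, assuming the Workspace API export endpoint and the requests library; the workspace URL, token, and notebook path are placeholders you have to supply:

import base64
import sys

import requests

WORKSPACE_URL = "https://<your-workspace>.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"                               # placeholder
NOTEBOOK_PATH = "/Users/<you>/functions"                        # placeholder

# Export the notebook as plain Python source via the Workspace API.
resp = requests.get(
    f"{WORKSPACE_URL}/api/2.0/workspace/export",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"path": NOTEBOOK_PATH, "format": "SOURCE"},
)
resp.raise_for_status()

# The exported source comes back base64-encoded; write it out as a .py file.
with open("/tmp/functions.py", "wb") as f:
    f.write(base64.b64decode(resp.json()["content"]))

# Make the directory importable and use a normal Python import.
sys.path.append("/tmp")
from functions import add  # the illustrative function from the sketch above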

Upvotes: 1
