Shyam

Reputation: 95

How to handle exceptions in Azure Databricks notebooks?

I am new to Azure and Spark, and I need help writing exception-handling code for the scenario below.

I have written HQL scripts (say hql1, hql2, hql3) in three different notebooks and call them all from one master notebook (hql-master) like this:

val df_tab1 = runQueryForTable("hql1", spark)
val df_tab2 = runQueryForTable("hql2", spark)

Now the output of each HQL script is stored as a dataframe, and I have to write exception handling in the master notebook: if the master notebook successfully executes all the dataframes (df_tab1, df_tab2), a success status should be inserted into the Synapse table job_status.

If there is any error/exception during the execution of the master notebook or any dataframe, that error message should be captured and a failure status inserted into the Synapse table.

I already have the INSERT scripts for the success/failure status. It would be really helpful if you could provide a sample code snippet showing how the exception handling can be achieved. Thank you!

Upvotes: 1

Views: 12365

Answers (1)

Alex Ott

Reputation: 87119

Basically, it's just simple try/except code, something like this:

results = {}
were_errors = False
for script_name in ['script1', 'script2', 'script3']:
  try:
    # dbutils.notebook.run requires a timeout (in seconds) as the second argument
    ret_value = dbutils.notebook.run(script_name, 3600)
    results[script_name] = ret_value
  except Exception as e:
    results[script_name] = f"Error: {e}"
    were_errors = True

if were_errors:
  log_failure(results)  # placeholder: run your failure INSERT; the error texts are in results
else:
  log_success(results)  # placeholder: run your success INSERT
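Note that the value you get back from dbutils.notebook.run is whatever the child notebook passes to dbutils.notebook.exit.

For the status insert itself, you said you already have the INSERT scripts, so just call them in place of the log_failure / log_success placeholders. In case it helps, here is a minimal sketch of appending a status row to the Synapse job_status table via the Azure Synapse connector; the JDBC URL, tempDir and the column names are assumptions to replace with your own:

status = "FAILURE" if were_errors else "SUCCESS"

# assumed columns (job_status, message) - adjust to your actual table schema
status_df = spark.createDataFrame(
  [(status, str(results))],
  schema="job_status string, message string",
)

(status_df.write
  .format("com.databricks.spark.sqldw")       # Azure Synapse connector
  .option("url", "<jdbc-connection-string>")  # your Synapse JDBC URL
  .option("tempDir", "abfss://<container>@<account>.dfs.core.windows.net/tmp")  # staging area
  .option("forwardSparkAzureStorageCredentials", "true")
  .option("dbTable", "job_status")
  .mode("append")
  .save())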

Upvotes: 2
