Prashant Rewatkar

Reputation: 21

How to read Azure Databricks output using API or class library

I have an Azure Databricks notebook which contains a SQL command. I need to capture the output of the SQL command and use it in .NET Core. Need help.

Upvotes: 2

Views: 3312

Answers (1)

SaurabhSharma

Reputation: 936

You cannot capture the results of an Azure Databricks notebook directly in .NET Core. There is also no .NET SDK available, so you need to rely on the Databricks REST APIs from your .NET code for all operations. You could try the following -

  1. Update your notebook to export the result of your SQL query as a CSV file to the file store using df.write. For example - df.write.format("com.databricks.spark.csv").option("header", "true").save("sqlResults.csv"). Note that Spark writes this as a directory named sqlResults.csv containing one or more part files, so the path you read back in step 4 will be a part file inside that directory.
  2. Set up a job with the above notebook; you can then invoke it from .NET using the Jobs API run-now endpoint.
  3. Poll the run's status from your .NET code (for example with the Jobs API runs get method) until the run reaches a terminal state.
  4. Once the job has completed, use the DBFS API read endpoint to fetch the content of the CSV file your notebook generated in step 1.
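The steps above can be sketched as follows. This is a minimal Python sketch of the same REST calls you would make from .NET via HttpClient; the workspace URL, token, job id, and DBFS path are all hypothetical placeholders, not values from the question.

```python
import base64
import json
import time
import urllib.request

DOMAIN = "https://<your-workspace>.azuredatabricks.net"  # hypothetical workspace URL
TOKEN = "<personal-access-token>"                        # hypothetical PAT

def _call(method, endpoint, payload=None):
    """Issue one authenticated Databricks REST call and return the parsed JSON."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(
        f"{DOMAIN}/api/2.0/{endpoint}",
        data=data,
        method=method,
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def decode_dbfs_read(response):
    """DBFS read returns the file bytes base64-encoded in the 'data' field."""
    return base64.b64decode(response["data"]).decode("utf-8")

def run_job_and_read_csv(job_id, dbfs_path, poll_seconds=10):
    # Step 2: trigger the job wrapping the notebook (Jobs API run-now).
    run_id = _call("POST", "jobs/run-now", {"job_id": job_id})["run_id"]
    # Step 3: poll until the run reaches a terminal life-cycle state.
    while True:
        state = _call("GET", f"jobs/runs/get?run_id={run_id}")["state"]
        if state["life_cycle_state"] in ("TERMINATED", "SKIPPED", "INTERNAL_ERROR"):
            break
        time.sleep(poll_seconds)
    # Step 4: read the CSV the notebook wrote. Note DBFS read returns at most
    # 1 MB per call; larger files need repeated reads with an offset.
    return decode_dbfs_read(_call("GET", f"dbfs/read?path={dbfs_path}"))
```

For example, `run_job_and_read_csv(42, "/sqlResults.csv/part-00000")` would return the CSV text once the run finishes; translating this to .NET is a matter of issuing the same requests with HttpClient and the same Bearer token header.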

Upvotes: 2
