Reputation: 1108
I am trying to set up Azure DevOps CI/CD for Databricks notebooks.
The Databricks notebooks in my dev instance are integrated with a Git repository, using the folder structure below.
I have created a build pipeline that detects changes in the Databricks folder of each code base (CodeA and CodeB) using path filters on the Trigger tab of the build pipeline, as shown below.
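For reference, the rough YAML equivalent of my Trigger tab settings is sketched below; the branch name and the Code/CodeA/databricks-style paths are assumptions matching my repo layout:

    # Rough YAML equivalent of the path filters set on the Trigger tab;
    # branch and folder names follow my repo layout, adjust as needed.
    trigger:
      branches:
        include:
          - master
      paths:
        include:
          - Code/CodeA/databricks
          - Code/CodeB/databricks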
But when publishing the artifacts, how can I select the path so that only the Databricks files from each code base are picked up, as in the folder structure above?
If that is not possible and I have to select the parent folder Code, which includes the Databricks files for both CodeA and CodeB, how can I deploy it into the Shared folder of the Databricks UAT instance, which has the folder structure below?
Ideally, the result should be as shown in the diagram below.
Any way to achieve this? Any leads appreciated.
Upvotes: 0
Views: 1720
Reputation: 30333
You can simply select the parent folder Code/, which includes the Databricks files for CodeA and CodeB, to publish in the build pipeline.
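If your build is defined in YAML, a minimal sketch of that publish step could look like the following; the folder and artifact names are assumptions based on the question:

    # Minimal sketch: stage and publish everything under Code/ as one artifact.
    # Folder and artifact names are assumptions based on the question.
    steps:
      - task: CopyFiles@2
        inputs:
          SourceFolder: 'Code'
          Contents: '**'
          TargetFolder: '$(Build.ArtifactStagingDirectory)'
      - task: PublishBuildArtifacts@1
        inputs:
          PathtoPublish: '$(Build.ArtifactStagingDirectory)'
          ArtifactName: 'Code'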
Then you need to create a release pipeline and use the third-party task Databricks Deploy Notebooks to deploy the notebooks.
When you create the release pipeline, click Add to select your build pipeline and add the artifacts.
Add a stage in the release pipeline, then add the Databricks Deploy Notebooks task to the stage's job.
Click the three-dots button next to the Source files path field to select the Databricks folder, and enter the Target files path of your Azure Databricks workspace.
Here you can set the paths so that each set of Databricks files is deployed to its corresponding folder in Azure Databricks. See below.
Then configure the Authentication method. See the document here on generating a Databricks bearer token for the task.
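If you run this stage as a YAML pipeline instead of the classic editor, one such task could be sketched as below. The task ID and input names are from the Data Thirst extension as I recall them, and the artifact alias _BuildPipeline, the $(databricksToken) variable, and the region are placeholders, so verify everything against the YAML view of your own task:

    # Sketch of one Databricks Deploy Notebooks task (Data Thirst extension).
    # Verify the task ID and inputs in your pipeline's YAML view; the paths,
    # region, and $(databricksToken) secret variable are placeholders.
    steps:
      - task: databricksDeployScripts@0
        inputs:
          authMethod: 'bearer'
          bearerToken: '$(databricksToken)'   # Databricks PAT stored as a secret variable
          region: 'westeurope'                # region of the target workspace
          localPath: '$(System.DefaultWorkingDirectory)/_BuildPipeline/Code/CodeA/databricks'
          databricksPath: '/Shared/CodeA'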
Add multiple Databricks Deploy Notebooks tasks, changing the Source files path and Target files path fields accordingly, to deploy each code base to a different Databricks folder; a scripted alternative is sketched below.
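As an alternative to multiple tasks, the same result can be scripted with the legacy Databricks CLI's workspace import_dir command. This is only a sketch: the host and token come from pipeline variables, and the artifact alias and folder layout are assumptions from the question:

    # Alternative sketch: deploy both folders with the legacy Databricks CLI.
    # DATABRICKS_HOST/DATABRICKS_TOKEN come from pipeline variables; the
    # artifact alias and folder layout are assumptions from the question.
    steps:
      - script: |
          pip install databricks-cli
          databricks workspace import_dir --overwrite \
            "$(System.DefaultWorkingDirectory)/_BuildPipeline/Code/CodeA/databricks" /Shared/CodeA
          databricks workspace import_dir --overwrite \
            "$(System.DefaultWorkingDirectory)/_BuildPipeline/Code/CodeB/databricks" /Shared/CodeB
        displayName: 'Deploy notebooks with the Databricks CLI'
        env:
          DATABRICKS_HOST: $(databricksHost)    # e.g. https://adb-xxxx.azuredatabricks.net
          DATABRICKS_TOKEN: $(databricksToken)  # bearer token, stored as a secret variable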
You can check this tutorial for more information.
Upvotes: 2