Reputation: 179
I've run into a release pipeline issue with Azure Data Factory. For most types of Azure entities (such as a web app or a SQL project), I can develop the project in Visual Studio and use ADO to create a build and releases that deploy the project to Azure. Using variables and library variable groups, I can define the appropriate release variables on a per-environment basis and ensure that the build I deploy is identical at each stage of the release pipeline (i.e. dev -> tst -> stg -> prd). In other words, the normal continuous integration/continuous deployment process.
With Azure Data Factory, however, it seems that I have to create the data factory directly in the Azure Portal for one environment and then create it again for each additional environment in the pipeline (I know I can export and re-import it manually).
What I would like is to be able to create an Azure Data Factory project in Visual Studio (2019), maintain it in Visual Studio with a designer similar to the one in the Azure portal, check it into Git, and deploy it with a build and release in ADO.
Since creating an Azure Data Factory project doesn't seem to be possible (or am I missing something?), what is the recommended approach to working with Azure Data Factory in a continuous integration/continuous deployment ADO environment?
Upvotes: 6
Views: 8217
Reputation: 44
ADF V2 does not have a Visual Studio plugin; most of the effort has gone into the UX side. We recommend using the ADF UI as your development tool, where you can define your workflows easily and validate your changes.
For CI/CD, you can integrate your Dev factory with Git and then set up CI/CD as described here: https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment
Every publish from the Dev factory will then trigger your release pipeline, which can take the published build and deploy it to the remaining stages.
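As a rough sketch of what the release side might look like: when the Dev factory is Git-integrated, clicking "Publish" generates ARM templates in an `adf_publish` branch, and a pipeline can deploy those templates per environment. The service connection, resource group, factory, and folder names below are hypothetical placeholders; adjust them to your setup.

```yaml
# Hypothetical azure-pipelines.yml fragment: deploy the ARM template that
# ADF's "Publish" writes to the adf_publish branch into a target environment.
trigger:
  branches:
    include:
      - adf_publish

pool:
  vmImage: ubuntu-latest

steps:
  - task: AzureResourceManagerTemplateDeployment@3
    inputs:
      deploymentScope: 'Resource Group'
      azureResourceManagerConnection: 'my-service-connection'   # hypothetical
      resourceGroupName: 'rg-datafactory-tst'                   # hypothetical
      location: 'West Europe'
      csmFile: '$(Build.SourcesDirectory)/my-dev-factory/ARMTemplateForFactory.json'
      csmParametersFile: '$(Build.SourcesDirectory)/my-dev-factory/ARMTemplateParametersForFactory.json'
      # Override per-environment values (factory name, connection strings, etc.)
      overrideParameters: '-factoryName adf-my-project-tst'     # hypothetical
```

For dev -> tst -> stg -> prd promotion you would repeat this deployment step per stage, swapping in each environment's resource group and parameter overrides from the corresponding variable group.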
Upvotes: 2