Estrobelai

Reputation: 309

How do I move a pipeline from an Azure Data Factory V2 to another (same Resource Group)?

What is the easiest way of moving a pipeline across from an Azure Data Factory V2 to another?

Both ADF V2 are in the same resource group.

Upvotes: 3

Views: 8196

Answers (3)

eishbis

Reputation: 21

I have found another way of moving pipelines from one ADF to another irrespective of Resource Group.

  1. Hover the mouse over the pipeline name and open the action menu.
  2. You will see the following options in this menu:


Open
Clone
Move to
Download Support files
Delete
  3. Click the "Download support files" option in the action menu. It downloads a zip of the ADF artifacts linked to this pipeline to your system/laptop.

  4. Note that if you have a driver pipeline that implicitly calls other pipelines (one or more) and you want to export the entire set of these pipelines, you only need to export the support files for the main pipeline that calls the others.

  5. When you open this zip folder locally on your system or laptop, you will see files in this order:

$ ls -lart
total 29
drwxr-xr-x 1 Is 1049089   0 May  6 14:16 ../
drwxr-xr-x 1 Is 1049089   0 May  6 14:16 ./
drwxr-xr-x 1 Is 1049089   0 May  6  2020 trigger/
drwxr-xr-x 1 Is 1049089   0 May  6  2020 pipeline/
drwxr-xr-x 1 Is 1049089   0 May  6  2020 linkedService/
drwxr-xr-x 1 Is 1049089   0 May  6  2020 integrationRuntime/
-rw-r--r-- 1 Is 1049089 260 May  6  2020 info.txt
-rw-r--r-- 1 Is 1049089 739 May  6  2020 diagnostic.json
drwxr-xr-x 1 Is 1049089   0 May  6  2020 dataset/
  6. Next, create a local clone of your Git or Azure DevOps Git repo on your system. Then create a branch from your collaboration branch, say import-pipeline, as sketched below.

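A minimal sketch of this step, assuming an Azure DevOps Git repo; the organization, project, repo, and collaboration branch names are placeholders:

$ git clone https://dev.azure.com/<organization>/<project>/_git/<repo>
$ cd <repo>
$ git checkout -b import-pipeline origin/<collaboration-branch>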
  7. When you list the contents of your new branch import-pipeline, you will see the artifacts as below:

$ ls -lart
total 37
drwxr-xr-x 1 Is 1049089  0 May  6 14:34 ../
-rw-r--r-- 1 Is 1049089 58 May  6 14:34 README.md
drwxr-xr-x 1 Is 1049089  0 May  6 14:34 ./
drwxr-xr-x 1 Is 1049089  0 May  6 14:36 notebooks/
drwxr-xr-x 1 Is 1049089  0 May  6 14:36 dataset/
drwxr-xr-x 1 Is 1049089  0 May  6 14:36 integrationRuntime/
drwxr-xr-x 1 Is 1049089  0 May  6 14:36 linkedService/
drwxr-xr-x 1 Is 1049089  0 May  6 14:36 pipeline/
drwxr-xr-x 1 Is 1049089  0 May  6 14:36 trigger/
drwxr-xr-x 1 Is 1049089  0 May  6 14:36 .git/
  8. Now, manually copy the artifacts from the following folders of the zip mentioned in step 5 to the corresponding folders of the import-pipeline branch mentioned in step 7: dataset, pipeline, trigger, and linkedService (a sketch follows below). Do not copy integrationRuntime, because the integration runtime (say, a self-hosted one) will vary from project to project.

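A sketch of the copy, assuming the support files zip has been extracted into a folder named adf-export next to the repo clone (both names are illustrative); integrationRuntime/ is deliberately left out:

$ unzip <support-files>.zip -d adf-export
$ cp adf-export/dataset/*.json       <repo>/dataset/
$ cp adf-export/pipeline/*.json      <repo>/pipeline/
$ cp adf-export/trigger/*.json       <repo>/trigger/
$ cp adf-export/linkedService/*.json <repo>/linkedService/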
  9. After importing the artifacts as mentioned in step 8, change the values of the access credentials for linked services, key vault URLs, and secret names, if any. The objective is that all linked services can connect successfully and all datasets can be browsed successfully.

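One way to locate values that typically need changing, assuming the linked services reference Azure Key Vault (the search strings are only examples):

$ grep -rl "vault.azure.net" linkedService/
$ grep -rn "secretName" linkedService/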
  10. Now push the changes in your local branch import-pipeline back to the remote repo. At this stage, you should be able to see the new pipeline and its artifacts in your ADF when you select the branch import-pipeline in Git mode.

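A sketch of the push, assuming the remote is named origin:

$ git add dataset pipeline trigger linkedService
$ git commit -m "Import pipeline artifacts from source data factory"
$ git push -u origin import-pipeline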
  11. Test the newly imported pipeline in debug mode in your ADF. If satisfied, merge the import-pipeline branch into the collaboration branch and publish your changes to the data factory.
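If you merge locally rather than through a pull request, a minimal sketch (assuming main is the collaboration branch):

$ git checkout main
$ git merge import-pipeline
$ git push origin main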

Upvotes: 2

MarkD

Reputation: 1701

If this is a one-off move, export the Resource Manager (ARM) template and import it into the other data factory, remembering to change the parameters as appropriate (such as the factory name).

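If you want to script the import, a minimal Azure CLI sketch, assuming arm_template.json was exported from the source factory's "Export ARM template" option; the factory-name parameter here is illustrative, so check the exported parameters file for the actual name:

$ az deployment group create \
    --resource-group <resource-group> \
    --template-file arm_template.json \
    --parameters factoryName=<target-data-factory-name>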
If you have a self-hosted Integration Runtime, you'll need to fix the IR reference once the template is imported: the import replicates the IR definition, but that IR should either be linked to the original IR or registered as its own IR.

If you combine Wang's suggestion and have a self-hosted IR, then I'd monitor my post here for some issues I am having with that.

M.

Upvotes: 1

Wang Zhang

Reputation: 327

Continuous integration & delivery in Azure Data Factory moves pipelines from one environment (development, test, production) to another, which should meet your requirement.

Upvotes: 0
