Reputation: 834
I have a number of pipeline/linked service/dataset JSON files and I need to upload them to my Data Factory, as opposed to creating new versions and copying the text over. What's the simplest way to do this?
Upvotes: 1
Views: 3680
Reputation: 77
AzureRM is no longer recommended by Microsoft.
You can use the updated PowerShell Az module to achieve this.
I won't repeat what is already quite self-explanatory in the official documentation: https://learn.microsoft.com/en-us/powershell/module/az.datafactory/set-azdatafactoryv2pipeline?view=azps-3.3.0
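For reference, a minimal sketch with the Az module; the subscription, resource group, data factory, pipeline name, and file path below are all placeholders to substitute with your own values:
# Sign in and select the subscription (placeholder name)
Connect-AzAccount
Set-AzContext -Subscription "your subs name here"
# Create or update the pipeline from a local JSON definition
Set-AzDataFactoryV2Pipeline -ResourceGroupName "your RG name" -DataFactoryName "your df name" -Name "pipelineName" -DefinitionFile "path to json file"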
Upvotes: 0
Reputation: 3209
If you are using version 1, you can use Visual Studio to do so, as shown here: https://azure.microsoft.com/en-us/blog/azure-data-factory-visual-studio-extension-for-authoring-pipelines/
If you are using version 2, you can do this with PowerShell. First, download and install Azure PowerShell from here: https://azure.microsoft.com/en-us/downloads/ Then, from PowerShell, log in and select your subscription:
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionName "your subs name here"
Then you can upload a pipeline JSON file with the following command:
Set-AzureRmDataFactoryV2Pipeline -DataFactoryName "your df name" -ResourceGroupName "your RG name" -Name "pipelineName" -DefinitionFile "path to json file"
Replace the placeholders with your data factory name, resource group name, pipeline name, and the path to the JSON file.
The same arguments are used to upload linked services and datasets with the commands below (see the batch sketch after them):
Set-AzureRmDataFactoryV2LinkedService
Set-AzureRmDataFactoryV2Dataset
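If you have many files to push, a small loop can upload a whole folder in one go. This is only a sketch: the linkedServices/datasets/pipelines folder layout is an assumption, and it relies on each file name matching the resource name. Deploy linked services first, then datasets, then pipelines, since each layer depends on the previous one.
# Placeholders: replace with your resource group and data factory names
$rg = "your RG name"
$df = "your df name"
# Linked services first (datasets reference them)
Get-ChildItem ".\linkedServices\*.json" | ForEach-Object {
    Set-AzureRmDataFactoryV2LinkedService -ResourceGroupName $rg -DataFactoryName $df -Name $_.BaseName -DefinitionFile $_.FullName -Force
}
# Then datasets (pipelines reference them)
Get-ChildItem ".\datasets\*.json" | ForEach-Object {
    Set-AzureRmDataFactoryV2Dataset -ResourceGroupName $rg -DataFactoryName $df -Name $_.BaseName -DefinitionFile $_.FullName -Force
}
# Pipelines last; -Force overwrites existing definitions without prompting
Get-ChildItem ".\pipelines\*.json" | ForEach-Object {
    Set-AzureRmDataFactoryV2Pipeline -ResourceGroupName $rg -DataFactoryName $df -Name $_.BaseName -DefinitionFile $_.FullName -Force
}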
Hope this helped!
Upvotes: 1