Reputation: 721
I would like to create an Azure Data Factory pipeline that copies a file to multiple storage accounts. My plan was to define the storage account connection info in a pipeline parameter as an array, then use the ForEach activity to loop over the objects in that array and pass each one's connection info to another pipeline; a sketch of that wiring follows the example array below:
[
    {
        "destinationBlob": {
            "connectionString": "Conn1"
        }
    },
    {
        "destinationBlob": {
            "connectionString": "Conn2"
        }
    },
    {
        "destinationBlob": {
            "connectionString": "Conn3"
        }
    }
]
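For context, this is roughly the ForEach wiring I have in mind; the parameter, activity, and pipeline names (destinations, CopyToDestination, CopyFileToBlob, connectionString) are placeholders, not anything that already exists:
{
    "name": "ForEachDestination",
    "type": "ForEach",
    "typeProperties": {
        "items": {
            "value": "@pipeline().parameters.destinations",
            "type": "Expression"
        },
        "activities": [
            {
                "name": "CopyToDestination",
                "type": "ExecutePipeline",
                "typeProperties": {
                    "pipeline": {
                        "referenceName": "CopyFileToBlob",
                        "type": "PipelineReference"
                    },
                    "waitOnCompletion": true,
                    "parameters": {
                        "connectionString": "@item().destinationBlob.connectionString"
                    }
                }
            }
        ]
    }
}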
My question is, is it possible to parameterize the connection to an Azure Blob Storage Linked Service?
Upvotes: 3
Views: 2682
Reputation: 31
This can actually be done. Sample JSON:
{
    "name": "DataLakeBlob",
    "type": "Microsoft.DataFactory/factories/linkedservices",
    "properties": {
        "parameters": {
            "StorageAccountEndpoint": {
                "type": "String",
                "defaultValue": "https://testblobstorage.blob.core.windows.net"
            }
        },
        "type": "AzureBlobStorage",
        "typeProperties": {
            "serviceEndpoint": "@{linkedService().StorageAccountEndpoint}"
        },
        "description": "Test Description"
    }
}
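To consume this from a pipeline, a dataset that references the linked service can map one of its own parameters onto StorageAccountEndpoint. A minimal sketch, assuming a Binary dataset; the dataset name, parameter name, and container here are illustrative:
{
    "name": "DestinationBlobDataset",
    "properties": {
        "type": "Binary",
        "linkedServiceName": {
            "referenceName": "DataLakeBlob",
            "type": "LinkedServiceReference",
            "parameters": {
                "StorageAccountEndpoint": "@dataset().BlobEndpoint"
            }
        },
        "parameters": {
            "BlobEndpoint": {
                "type": "String"
            }
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "output"
            }
        }
    }
}
A Copy activity using this dataset can then supply a different BlobEndpoint per iteration, e.g. @item().destinationBlob.connectionString from the question's ForEach loop.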
Upvotes: 3
Reputation: 17126
Edit: Microsoft has since documented this officially: https://learn.microsoft.com/en-us/azure/data-factory/parameterize-linked-services
For those looking for SAS token parameterization, you could use the following JSON. Be sure to check the "Specify dynamic contents in JSON format" checkbox when editing the linked service so the JSON applies.
{
    "type": "Microsoft.DataFactory/factories/linkedservices",
    "properties": {
        "parameters": {
            "StorageAccountEndpoint": {
                "type": "String",
                "defaultValue": "https://<<yourstorageaccountname>>.blob.core.windows.net/?sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2019-10-20T16:33:57Z&st=2019-09-20T08:33:57Z&spr=https&sig=lDrBjD%2BjM2T1XjRW997VPMqDp99ZxVoReyRK0VEX7zQ%3D"
            }
        },
        "type": "AzureBlobStorage",
        "typeProperties": {
            "sasUri": "@{linkedService().StorageAccountEndpoint}"
        }
    }
}
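The SAS variant is consumed the same way as the service-endpoint example in the answer above: reference the linked service from a dataset, map a dataset parameter onto StorageAccountEndpoint, and supply a per-destination SAS URI at runtime (e.g. from the question's ForEach loop via @item().destinationBlob.connectionString).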
Upvotes: 0