user1393689

Reputation: 61

Synapse pipeline - Blob storage event trigger - Pipeline failing with Microsoft.DataTransfer.Common.Shared.HybridDeliveryException

I have a copy activity on a storage event trigger. The pipeline is triggered when a blob is added to storage (ADLS Gen2). However, the pipeline's copy activity fails with the error below when it is run by the storage trigger. The pipeline runs successfully via Run/Debug, but fails on the StorageEventTrigger (and on a ManualTrigger).

Operation on target Copy failed: ErrorCode=UnsupportedDataStoreEndpoint,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The data store endpoint is not supported in 'AzureBlobFS' connector. Error message : 'The domain of this endpoint is not in allow list. Original endpoint: '::redacted::.blob.core.windows.net'',Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.DataTransfer.SecurityValidation.Exceptions.UrlValidationException,Message=The domain of this endpoint is not in allow list. Original endpoint: '::redacted::.blob.core.windows.net',Source=Microsoft.DataTransfer.SecurityValidation,'

Trigger payload:

{
     "topic": "/subscriptions/<redacted>/resourceGroups/<redacted>/providers/Microsoft.Storage/storageAccounts/<redacted>",
     "subject": "/blobServices/default/containers/<redacted>/blobs/<redacted>",
     "eventType": "Microsoft.Storage.BlobCreated",
     "id": "<redacted>",
     "data": {
         "api": "PutBlob",
         "clientRequestId": "<redacted>",
         "requestId": "<redacted>",
         "eTag": "<redacted>",
         "contentType": "text/plain",
         "contentLength": 214,
         "blobType": "BlockBlob",
         "blobUrl": "https://<redacted>.blob.core.windows.net/<redacted>",
         "url": "https://<redacted>.blob.core.windows.net/<redacted>",
         "sequencer": "<redacted>",
         "identity": "$superuser",
         "storageDiagnostics": {
             "batchId": "<redacted>"
         }
     },
     "dataVersion": "",
     "metadataVersion": "1",
     "eventTime": "2022-10-05T22:31:48.3346541Z"
 }
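
For reference, the storage event trigger that produces this payload is a plain BlobEventsTrigger definition along these lines; the names, path filter, and parameter mapping below are illustrative placeholders, not the exact JSON from my workspace:

{
    "name": "StorageEventTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "scope": "/subscriptions/<redacted>/resourceGroups/<redacted>/providers/Microsoft.Storage/storageAccounts/<redacted>",
            "events": [ "Microsoft.Storage.BlobCreated" ],
            "blobPathBeginsWith": "/<container>/blobs/",
            "ignoreEmptyBlobs": true
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "<pipeline name>",
                    "type": "PipelineReference"
                },
                "parameters": {
                    "sourceFolder": "@triggerBody().folderPath",
                    "sourceFile": "@triggerBody().fileName"
                }
            }
        ]
    }
}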

UPDATE: According to Copy and transform data in Azure Data Lake Storage Gen2 using Azure Data Factory or Azure Synapse Analytics (the system-assigned managed identity authentication section), the AzureBlobFS linked service must have:

Property | Description | Required
url | Endpoint for Data Lake Storage Gen2, with the pattern of https://<accountname>.dfs.core.windows.net. | Yes

My trigger payload comes back with a blob.core.windows.net URL, which seems to conflict with what the copy activity's connector expects.
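
So the copy activity's ADLS Gen2 linked service has to point at the dfs endpoint. Based on that documentation page, a linked service using system-assigned managed identity authentication looks roughly like this (account name and integration runtime name are placeholders):

{
    "name": "AzureDataLakeStorageGen2LinkedService",
    "properties": {
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "https://<accountname>.dfs.core.windows.net"
        },
        "connectVia": {
            "referenceName": "<integration runtime name>",
            "type": "IntegrationRuntimeReference"
        }
    }
}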

Upvotes: 0

Views: 801

Answers (1)

user1393689

Reputation: 61

Resolved by creating a new linked service to the Blob storage account. The only difference was the account selection method: "From Azure subscription" instead of "Enter manually".
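
My guess, based on the error message and the documentation quoted in the question rather than on a diff of the two definitions, is that the only material difference ended up being the endpoint stored under typeProperties.url.

Original linked service, URL entered manually (assumed):

{
    "type": "AzureBlobFS",
    "typeProperties": { "url": "https://<accountname>.blob.core.windows.net" }
}

Recreated linked service, account picked from the Azure subscription:

{
    "type": "AzureBlobFS",
    "typeProperties": { "url": "https://<accountname>.dfs.core.windows.net" }
}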

Upvotes: 0
