tommyg

Reputation: 66

Azure Data Factory V2 - Copy Task fails HTTP file to Azure Blob Store

I have an HTTP file dataset as the "Source" of a Copy activity. It hits an Azure Function HTTP endpoint and returns a string when complete. I want to store that string result in an Azure Blob "Sink".

My Linked Service looks like so.

Linked Service

My Dataset looks like so.

Dataset

I get the following error when debugging

"{ "errorCode": "2200", "message": "Failure happened on 'Sink' side. ErrorCode=UserErrorInvalidHttpRequestHeaderFormat,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to set addtional http header,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.ArgumentException,Message=Specified value has invalid HTTP Header characters.\r\nParameter name: name,Source=System,'", "failureType": "UserError", "target": "Copy Data1" }"

Upvotes: 2

Views: 3692

Answers (2)

Mark Z.

Reputation: 2447

It's yet another unfortunate ADF/Synapse Pipelines quirk. Unlike Web activities or REST datasets, the HTTP dataset has a single text input for headers rather than separate header/value input rows.

After fiddling around with various combinations, the ONLY one that worked when using DYNAMIC input was the following:

@{string('Authorization: Bearer justanexample')}
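If the token comes from a pipeline parameter, the same pattern should also work with concat; this is just a sketch, and bearerToken is a placeholder parameter name, not something from the original setup:

    @{concat('Authorization: Bearer ', pipeline().parameters.bearerToken)}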

Upvotes: 1

Wang Zhang

Reputation: 327

According to the error message, the problem lies in the requestHeader setting. Please note that the format of the requestHeader in the HTTP dataset should be like "key1:value1\nkey2:value2\nkey3:value3", so in your case passing Content-Type: application/json (a single "key:value" string, not a JSON key/value pair) as the requestHeader should be the right format. Thanks.
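As a rough sketch of where that string ends up (dataset name, linked service name, and relative URL are placeholders; in the underlying HttpFile dataset JSON the field is typically called additionalHeaders):

    {
        "name": "HttpSourceDataset",
        "properties": {
            "type": "HttpFile",
            "linkedServiceName": {
                "referenceName": "HttpLinkedService",
                "type": "LinkedServiceReference"
            },
            "typeProperties": {
                "relativeUrl": "api/MyFunction",
                "requestMethod": "GET",
                "additionalHeaders": "Content-Type: application/json"
            }
        }
    }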

Upvotes: 2
