Debbie Edwards

Reputation: 1

Data Factory V2 Error Code 2200 on Data Lake Sink

I'm using the following as a walkthrough: https://www.youtube.com/watch?v=IAqJ6nCDtGc

I have some sales data in an Azure SQL Database (rather than an on-premises database).

And I have an Azure Data Lake Storage Gen1 account. I've successfully set up the connectors and datasets, and each connector has tested OK.

To create the Azure Data Lake connector I created an app registration to get the information I needed for the service principal ID and the service principal key.
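For context, this is a minimal sketch of what a Data Lake Store Gen1 linked service with service principal authentication looks like in Data Factory V2. The name and the placeholder values below are illustrative only, not my actual settings:

{
    "name": "AzureDataLakeStoreLinkedService",
    "properties": {
        "type": "AzureDataLakeStore",
        "typeProperties": {
            "dataLakeStoreUri": "https://<account name>.azuredatalakestore.net/webhdfs/v1",
            "servicePrincipalId": "<application (client) ID from the app registration>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<key created for the app registration>"
            },
            "tenant": "<Azure AD tenant ID>",
            "subscriptionId": "<subscription ID>",
            "resourceGroupName": "<resource group of the Data Lake Store>"
        }
    }
}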

I've created the pipeline with a copy activity as per the above video.
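The copy activity itself is the standard Copy type with an Azure SQL source and a Data Lake Store sink, roughly like this sketch (the pipeline and dataset names here are illustrative; only the activity name matches the error below):

{
    "name": "CopySalesPipeline",
    "properties": {
        "activities": [
            {
                "name": "Copy Sales data to data lake",
                "type": "Copy",
                "inputs": [ { "referenceName": "AzureSqlSalesDataset", "type": "DatasetReference" } ],
                "outputs": [ { "referenceName": "DataLakeSalesDataset", "type": "DatasetReference" } ],
                "typeProperties": {
                    "source": { "type": "AzureSqlSource" },
                    "sink": { "type": "AzureDataLakeStoreSink" }
                }
            }
        ]
    }
}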

When I run it I get the following

{ "errorCode": "2200", "message": "Failure happened on 'Sink' side. ErrorCode=UserErrorFailedFileOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The request to 'Unknown' failed and the status code is 'BadRequest', request id is ''. \r\nBad Request\r\n\r\n

Bad Request - Invalid URL

\r\n

HTTP Error 400. The request URL is invalid.

\r\n\r\n ,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=The remote server returned an error: (400) Bad Request.,Source=System,'", "failureType": "UserError", "target": "Copy Sales data to data lake" }

I've checked the URL in the data lake connector and this seems fine:

"dataLakeStoreUri": "https://.azuredatalakestore.net/webhdfs/v1",

The only other URL I can think of is the sign-on URL set up when registering the app (as described here):

https://learn.microsoft.com/en-us/azure/active-directory/develop/howto-create-service-principal-portal

If anyone can help it would be greatly appreciated.

Debbie

Upvotes: 0

Views: 5400

Answers (2)

Debbie Edwards

Reputation: 1

I found out what the issue was.

I had set up a dynamic file path for the data lake so the file would be placed into Year/Month/Day folders, but this had caused the above issue. Once I deleted this path and just ran it into the data lake, it worked.

The file path I used was from this YouTube how-to guide:

https://www.youtube.com/watch?v=IAqJ6nCDtGc

I still need to figure out how to do the above, but at least I can get my file into the data lake.
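For reference, one way the Year/Month/Day folders can be written in Data Factory V2 is with expression functions in the dataset's folderPath instead of a literal path. This is only a rough sketch (the dataset name, "sales" root folder, and file name are illustrative), not the exact setup from the video:

{
    "name": "DataLakeSalesDataset",
    "properties": {
        "type": "AzureDataLakeStoreFile",
        "linkedServiceName": {
            "referenceName": "AzureDataLakeStoreLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "folderPath": "sales/@{formatDateTime(utcnow(),'yyyy')}/@{formatDateTime(utcnow(),'MM')}/@{formatDateTime(utcnow(),'dd')}",
            "fileName": "sales.csv",
            "format": { "type": "TextFormat" }
        }
    }
}

If the @{...} parts are not entered as dynamic content and never get resolved (for example, V1-style slice variables copied from older guides), the literal text ends up in the request URL, which could explain a 400 Bad Request like the one above.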

Upvotes: 0

MarkD

Reputation: 1711

A couple of thoughts.

Can you check you have the account URI set to "https://<your ADL endpoint name>.azuredatalakestore.net/webhdfs/v1"? Above you have ".azuredatalakestore.net/webhdfs/v1", but I understand you may have deleted the endpoint in this post for privacy.

On the ADL permissions, have you assigned permissions through Data Explorer in ADL? The service principal, I believe, needs execute permissions on the root folder and all the folders in the required path, plus read/write on anything that will be written to, i.e. if I am writing to \foo\bar.txt:

\       permissions needed = x
foo     permissions needed = x
bar.txt permissions needed = rw

See here for details.

Finally, folders are case sensitive so check that they are being referenced correctly.

Hope this helps. Mark.

Upvotes: 1
