xavier

Reputation: 177

Error using Data Factory for copy activity from blob storage as source

Why do I keep getting this error when I use a folder from a blob container (which contains only one GZ-compressed file) as the source of a copy activity in Data Factory v2, with another blob storage as the sink (where I want the file decompressed)?

 "message":"ErrorCode=UserErrorFormatIsRequired,
'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,
Message=Format setting is required for file based store(s) in this scenario.,Source=Microsoft.DataTransfer.ClientLibrary,'",

I know this means I need to explicitly specify the format for my sink dataset, but I am not sure how to do that.

Upvotes: 1

Views: 1801

Answers (3)

Leon Yue

Reputation: 16401

Going by your comment, I tried this many times: unless you choose the compressed file as the source dataset and import its schema, the Azure Data Factory copy activity will not decompress the file for you.

If the files inside the compressed archive don't have the same schema, the copy activity can also fail.
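As a rough sketch of what that looks like in the dataset JSON (all names and paths below are placeholders): the source dataset carries both a format and a compression block, while the sink dataset has only a format, so the copy activity decompresses on read and writes a plain file.

```json
{
    "name": "GzipSourceDataset",
    "properties": {
        "type": "AzureBlob",
        "linkedServiceName": {
            "referenceName": "SourceBlobLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "folderPath": "sourcecontainer/inputfolder",
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ",",
                "firstRowAsHeader": true
            },
            "compression": {
                "type": "GZip",
                "level": "Optimal"
            }
        }
    }
}
```

And the sink dataset, with no compression block:

```json
{
    "name": "DecompressedSinkDataset",
    "properties": {
        "type": "AzureBlob",
        "linkedServiceName": {
            "referenceName": "SinkBlobLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "folderPath": "sinkcontainer/outputfolder",
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ",",
                "firstRowAsHeader": true
            }
        }
    }
}
```

Leaving the format block out of one of the datasets in this kind of scenario is what produces the UserErrorFormatIsRequired error shown in the question.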

Hope this helps.

Upvotes: 1

Atvoid

Reputation: 187

I suggest using the copy data tool.

Step 1: open the Copy Data tool.

Step 2: choose Binary copy.
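If you would rather author it by hand, the pipeline the tool produces boils down to a copy activity along these lines (dataset names are placeholders, and the exact properties depend on the options picked in the wizard):

```json
{
    "name": "CopyGzipBlobToBlob",
    "type": "Copy",
    "inputs": [
        { "referenceName": "GzipSourceDataset", "type": "DatasetReference" }
    ],
    "outputs": [
        { "referenceName": "DecompressedSinkDataset", "type": "DatasetReference" }
    ],
    "typeProperties": {
        "source": { "type": "BlobSource" },
        "sink": { "type": "BlobSink" }
    }
}
```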

Upvotes: 2

Martin Esteban Zurita

Reputation: 3209

The easiest way to do this: go to the dataset, click on the Schema tab, then Import Schema.
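Importing the schema writes it into the dataset's structure section. A hypothetical result for a two-column file could look like this (all names and paths are placeholders):

```json
{
    "name": "GzipSourceDataset",
    "properties": {
        "type": "AzureBlob",
        "linkedServiceName": {
            "referenceName": "SourceBlobLinkedService",
            "type": "LinkedServiceReference"
        },
        "structure": [
            { "name": "id", "type": "String" },
            { "name": "amount", "type": "Decimal" }
        ],
        "typeProperties": {
            "folderPath": "sourcecontainer/inputfolder",
            "format": { "type": "TextFormat" },
            "compression": { "type": "GZip" }
        }
    }
}
```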


Hope this helped!!

Upvotes: 0
