Martin Stoyanov

Reputation: 51

Azure Data Factory Copy Data from Blob to Cosmos DB - need help skipping files over 2 MB

I have an Azure Data Factory Copy Activity within a pipeline, copying data from a Blob container (JSON files in multiple virtual folders) to Cosmos DB. However, there are fringe cases that cannot be avoided, where files larger than 2 MB are placed in the Blob storage. When the copy activity picks them up, the transfer (and the subsequent pipeline activities) fails because I hit the 2 MB per-document hard limit of Cosmos DB. I have tried setting up a Lookup / Get Metadata activity, but I can't seem to properly address the relevant size property or produce the output needed for a Delete activity.

Can anyone advise on an approach to handle this?

Thank you.

Upvotes: 0

Views: 239

Answers (1)

Jay Gong

Reputation: 23782

It is possible to get the size of a file with the Get Metadata activity. Please note that the size is reported in bytes and can only be retrieved for individual files, not folders.

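As a rough sketch of how this could be wired up (the activity name below is just a placeholder): point a Get Metadata activity at the file, select the Size field in its Field list, and follow it with an If Condition activity whose expression compares the returned byte count against the 2 MB limit (2,097,152 bytes):

```
@lessOrEquals(activity('Get File Metadata').output.size, 2097152)
```

The Copy activity would go on the True branch; on the False branch you could place the Delete activity (or simply do nothing), so that oversized files never reach Cosmos DB.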

As far as I know, there is no way to avoid the 2 MB limitation on a single Cosmos DB document. You could refer to this question: What is the size limit of a single document stored in Azure Cosmos DB

Upvotes: 1
