m1nkeh

Reputation: 1397

Azure Data Factory Slices

There are lots of demos online where slicing is carried out on blobs of the myblobcontainer/{Year}/{Month}/{Day} format or similar (e.g. https://azure.microsoft.com/en-gb/documentation/articles/data-factory-scheduling-and-execution/).

Clearly this allows for easy slicing of the data, as the year, month, and day parameters are explicitly defined in the path.

What I have, though, are files that look more like this:

myblobcontainer/log_20151231_144229.csv

where the timestamp format is YYYYMMDD_HHMMSS.

I want to process my files hourly, not re-process anything, and ideally not have to mess around too much restructuring my blobs.

Does anyone have any idea how I can read only the files that fall inside each hourly slice?

Upvotes: 0

Views: 632

Answers (1)

Yingqin

Reputation: 205

The folderPath in a Blob dataset works as a path prefix. So you can set the folderPath to "$$Text.Format('myblobcontainer/log_{0:yyyyMMdd}', WindowStart)", and all files matching that prefix will be copied.
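As a rough sketch, a Data Factory (v1) blob input dataset using that prefix trick might look like the following. The dataset and linked service names here are hypothetical placeholders, and the format/availability settings are assumptions for an hourly CSV scenario:

```json
{
  "name": "LogBlobInput",
  "properties": {
    "type": "AzureBlob",
    "linkedServiceName": "MyStorageLinkedService",
    "typeProperties": {
      "folderPath": "$$Text.Format('myblobcontainer/log_{0:yyyyMMdd}', WindowStart)",
      "format": {
        "type": "TextFormat",
        "columnDelimiter": ","
      }
    },
    "external": true,
    "availability": {
      "frequency": "Hour",
      "interval": 1
    }
  }
}
```

Note that a yyyyMMdd prefix selects all files for the day of the slice's WindowStart, not just that hour; to narrow the prefix further you would need the hour to appear in a prefix-matchable position in the blob name.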

Upvotes: 1
