WibbleWobble

Reputation: 23

Using Azure Data Factory to read only one file from blob storage and load it into a DB

I'd like to read just one file from a blob storage container and load it into a database with a copy operation, after the file's arrival has set off a trigger.

Using the Microsoft documentation, the closest I can get is reading all the files in order of Modified Date.

Does anyone know how to read one file after it has arrived in my blob storage?

EDIT: Just to clarify, I'm looking to read only the latest file automatically, without hardcoding the filename.

Upvotes: 1

Views: 1510

Answers (1)

Joel Cochran

Reputation: 7768

You can specify a single blob in the dataset. This value can be hardcoded or supplied through variables (using dataset parameters).

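As a minimal sketch (the dataset name, linked service, container, and `fileName` parameter below are assumed for illustration, not taken from the original post), a parameterized delimited-text dataset pointing at a single blob could look like this:

```json
{
    "name": "SingleBlobDataset",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "MyBlobStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "fileName": { "type": "string" }
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "incoming",
                "fileName": {
                    "value": "@dataset().fileName",
                    "type": "Expression"
                }
            }
        }
    }
}
```

Any pipeline that uses this dataset then supplies `fileName` at runtime, so nothing is hardcoded.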

If you need to run this process whenever a new blob is created or updated, you can use the Event Trigger.

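As a sketch (the trigger, pipeline, and parameter names are assumed, and the `scope` is a placeholder for your storage account's resource ID), a Blob Events Trigger can hand the name of the blob that fired it to the pipeline through `@triggerBody().fileName`:

```json
{
    "name": "OnNewBlobTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>",
            "events": [ "Microsoft.Storage.BlobCreated" ],
            "blobPathBeginsWith": "/incoming/blobs/"
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "LoadSingleFilePipeline",
                    "type": "PipelineReference"
                },
                "parameters": {
                    "fileName": "@triggerBody().fileName"
                }
            }
        ]
    }
}
```

The pipeline can pass its `fileName` parameter straight through to the dataset parameter shown earlier.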

EDIT:

Based on your addition of "only the latest", I don't have a direct solution. Normally, you could use Lookup or GetMetadata activities, but neither they nor the expression language support sorting or ordering. One option would be to use an Azure Function to determine the file to process.
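
If you go the Azure Function route, the ADF side is an Azure Function activity; a rough sketch follows (the linked service, function name, and the `fileName` property the function is assumed to return are all hypothetical):

```json
{
    "name": "GetLatestBlobName",
    "type": "AzureFunctionActivity",
    "linkedServiceName": {
        "referenceName": "MyFunctionAppLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "functionName": "GetLatestBlob",
        "method": "GET"
    }
}
```

Assuming the function lists the container's blobs, picks the one with the newest last-modified timestamp, and returns a JSON body such as `{"fileName": "sales_2020-06-01.csv"}`, a downstream activity could read the result with an expression along the lines of `@activity('GetLatestBlobName').output.fileName`.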

However, if you think about the Event Trigger I mention above: every time it fires, the file (blob) is the most recent one in the folder. If you want to coalesce this across a certain period of time, something like this might work:

  1. Logic App 1 on event trigger: store the blob name in a log [blob, SQL, whatever works for you].
  2. Logic App 2 OR ADF pipeline on recurrence trigger: read the log to grab the "most recent" blob name (a sketch of this step follows the list).
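
As a sketch of step 2 (the dataset, table, and column names are assumed), an ADF Lookup activity can do the sorting in SQL that the expression language cannot:

```json
{
    "name": "GetMostRecentBlobName",
    "type": "Lookup",
    "typeProperties": {
        "source": {
            "type": "AzureSqlSource",
            "sqlReaderQuery": "SELECT TOP 1 blob_name FROM dbo.blob_log ORDER BY arrived_at DESC"
        },
        "dataset": {
            "referenceName": "BlobLogDataset",
            "type": "DatasetReference"
        },
        "firstRowOnly": true
    }
}
```

Downstream activities would then reference `@activity('GetMostRecentBlobName').output.firstRow.blob_name`, for example as the `fileName` parameter of the single-blob dataset above.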

Upvotes: 1
