Mangesh T.

Reputation: 23

How to read files from Azure Blob Storage with a folder structure of 'StartDateOfMonth-EndDateOfMonth'?

Scenario

We have an Azure Blob Storage container with the following folder structure:

• 20190601-20190630

Basically, this folder will contain daily CSV files for the given month.

This folder structure is dynamic, so next month the folder 20190701-20190731 will be populated with daily CSV files.

Problem

On a daily basis, we need to move these files from Azure Blob Storage to Azure Data Lake using Azure Data Factory (v2).

How do we dynamically specify the folder structure in the input dataset (Azure Blob Storage) in Azure Data Factory (v2)?

Example: 20190601-20190630/*.CSV for June 2019

Basically, StartDateOfMonth and EndDateOfMonth are dynamic.

Thanks in Advance

Upvotes: 1

Views: 319

Answers (1)

Jay Gong

Reputation: 23792

You could configure your dataset folder path like this:

   "folderPath": {
                        "value": "@concat( 
                               formatDateTime(pipeline().parameters.scheduledRunTimeStart, 'yyyyMMdd'), 
                               '-',
                               formatDateTime(pipeline().parameters.scheduledRunTimeEnd, 'yyyyMMdd')
                               , '/'
                        "type": "Expression"
    }
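With scheduledRunTimeStart = 2019-06-01 and scheduledRunTimeEnd = 2019-06-30, this expression evaluates to the folder path 20190601-20190630/, matching the naming scheme in the question.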

And define the parameters on the dataset so the pipeline can pass values in:

"parameters": {
    "scheduledRunTimeStart": {
        "type": "String"
    },
    "scheduledRunTimeEnd": {
        "type": "String"
    }
}
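For completeness, here is a minimal sketch (the dataset name BlobInputDataset is illustrative, not from the original answer) of how a copy activity's input could supply the current month's boundaries to those dataset parameters, using the ADF expression functions startOfMonth, addDays and utcNow:

"inputs": [
    {
        "referenceName": "BlobInputDataset",
        "type": "DatasetReference",
        "parameters": {
            "scheduledRunTimeStart": "@startOfMonth(utcNow())",
            "scheduledRunTimeEnd": "@addDays(startOfMonth(addDays(startOfMonth(utcNow()), 35)), -1)"
        }
    }
]

Here scheduledRunTimeEnd jumps into the next month (first of the current month plus 35 days), takes startOfMonth again, and steps back one day, which yields the last day of the current month.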

Upvotes: 0
