emie

Reputation: 289

Azure Data Factory V2 Dataset Dynamic Folder

In Azure Data Factory (V1) I was able to create a slice and store the output to a specific folder (i.e. {Year}/{Month}/{Day}). See code below.

How do you create the same type of slice in Azure Data Factory V2? I did find that you have to create a parameter, but I was unable to figure out how to pass the parameter.

"folderPath": "@{dataset().path}",
"parameters": {
    "path": {
        "type": "String"
    }
}
Here is the original ADF V1 code.

{
    "name": "EMS_EMSActivations_L1_Snapshot",
    "properties": {
        "published": false,
        "type": "AzureDataLakeStore",
        "linkedServiceName": "SalesIntelligence_ADLS_LS",
        "typeProperties": {
            "fileName": "EMS.FACT_EMSActivations_WTA.tsv",
            "folderPath": "/Snapshots/EMS/FACT_EMSActivations_WTA/{Year}/{Month}/{Day}",
            "format": {
                "type": "TextFormat",
                "rowDelimiter": "␀",
                "columnDelimiter": "\t",
                "nullValue": "#NULL#",
                "quoteChar": "\""
            },
            "partitionedBy": [
                {
                    "name": "Year",
                    "value": {
                        "type": "DateTime",
                        "date": "SliceStart",
                        "format": "yyyy"
                    }
                },
                {
                    "name": "Month",
                    "value": {
                        "type": "DateTime",
                        "date": "SliceStart",
                        "format": "MM"
                    }
                },
                {
                    "name": "Day",
                    "value": {
                        "type": "DateTime",
                        "date": "SliceStart",
                        "format": "dd"
                    }
                },
                {
                    "name": "Hour",
                    "value": {
                        "type": "DateTime",
                        "date": "SliceStart",
                        "format": "HH"
                    }
                },
                {
                    "name": "Minute",
                    "value": {
                        "type": "DateTime",
                        "date": "SliceStart",
                        "format": "mm"
                    }
                }
            ]
        },
        "availability": {
            "frequency": "Day",
            "interval": 1
        }
    }
}

Upvotes: 0

Views: 7629

Answers (2)

kannandreams

Reputation: 95

Step 1: Use WindowStartTime / WindowEndTime in folderPath

"folderPath": {
    "value": "<<path>>/@{formatDateTime(pipeline().parameters.windowStart,'yyyy')}-@{formatDateTime(pipeline().parameters.windowStart,'MM')}-@{formatDateTime(pipeline().parameters.windowStart,'dd')}/@{formatDateTime(pipeline().parameters.windowStart,'HH')}/",
    "type": "Expression"
}

Step 2: Add the parameters to the pipeline JSON

"parameters": {
    "windowStart": {
        "type": "String"
    },
    "windowEnd": {
        "type": "String"
    }
}

Step 3: Add run parameters in the tumbling window trigger (these supply the pipeline parameters declared in Step 2)

"parameters": {
    "windowStart": {
        "type": "Expression",
        "value": "@trigger().outputs.windowStartTime"
    },
    "windowEnd": {
        "type": "Expression",
        "value": "@trigger().outputs.windowEndTime"
    }
}
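Putting the three steps together: the parameters block from Step 3 sits under the pipeline reference inside the trigger definition. A hedged sketch of a full tumbling window trigger (the trigger and pipeline names are assumed; in the format used by the linked documentation, parameter values are written as plain @-expression strings):

```json
{
    "name": "DailyTumblingTrigger",
    "properties": {
        "type": "TumblingWindowTrigger",
        "typeProperties": {
            "frequency": "Hour",
            "interval": 24,
            "startTime": "2018-01-01T00:00:00Z",
            "maxConcurrency": 1
        },
        "pipeline": {
            "pipelineReference": {
                "referenceName": "CopyPipeline",
                "type": "PipelineReference"
            },
            "parameters": {
                "windowStart": "@trigger().outputs.windowStartTime",
                "windowEnd": "@trigger().outputs.windowEndTime"
            }
        }
    }
}
```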

For more details, refer to this link:

https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/data-factory/how-to-create-tumbling-window-trigger.md

Upvotes: 0

emie

Reputation: 289

Here is how you create a dynamic folder path when importing data from SQL into ADLS. Look at the folderPath property.

{
    "name": "EBC_BriefingActivitySummary_L1_Snapshot",
    "properties": {
        "linkedServiceName": {
            "referenceName": "SIAzureDataLakeStore",
            "type": "LinkedServiceReference"
        },
        "type": "AzureDataLakeStoreFile",
        "typeProperties": {
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ",",
                "rowDelimiter": "",
                "nullValue": "\\N",
                "treatEmptyAsNull": false,
                "firstRowAsHeader": false
            },
            "fileName": {
                "value": "EBC.rpt_BriefingActivitySummary.tsv",
                "type": "Expression"
            },
            "folderPath": {
                "value": "@concat('/Snapshots/EBC/rpt_BriefingActivitySummary/', formatDateTime(pipeline().parameters.scheduledRunTime, 'yyyy'), '/', formatDateTime(pipeline().parameters.scheduledRunTime, 'MM'), '/', formatDateTime(pipeline().parameters.scheduledRunTime, 'dd'), '/')",
                "type": "Expression"
            }
        }
    }
}
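For the folderPath expression above to resolve, scheduledRunTime must be declared as a parameter on the pipeline and supplied when the pipeline runs. A hedged sketch of a schedule trigger passing it (the trigger and pipeline names are assumptions):

```json
{
    "name": "DailySnapshotTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",
                "interval": 1,
                "startTime": "2018-01-01T00:00:00Z"
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "BriefingActivityPipeline",
                    "type": "PipelineReference"
                },
                "parameters": {
                    "scheduledRunTime": "@trigger().scheduledTime"
                }
            }
        ]
    }
}
```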

Upvotes: 3
