ADF-Learner

Reputation: 29

How to pass datetime parameter from pipeline to Dataflow in Azure

I want to read files from the data lake in a Dataflow based on a datetime filter, and I am trying to pass the start time from the pipeline, which I get from a SQL table. I have tried multiple ways but it is not working.

Upvotes: 0

Views: 354

Answers (1)

Rakesh Govindula

Reputation: 11284

As you are getting the required date from a SQL table, use a Lookup activity with First row only enabled on your SQL table to get that value.

Convert that value to the correct date format with the below expression in a string type Set variable activity. Here, I have passed a hardcoded date value; in your case it will be @activity('lookup1').output.firstRow.<date_column> in the below expression.

@convertToUtc('1/16/2024 13:00:00','UTC')
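For example, with the Lookup output plugged in, the Set variable expression would look roughly like this (assuming the Lookup activity is named lookup1 and the date column is named StartDate; both names are placeholders for your own):

@convertToUtc(activity('lookup1').output.firstRow.StartDate, 'UTC')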


This will be converted to UTC format, but it will still be a string.


In the Dataflow, create a string type parameter.
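A minimal sketch of how that parameter appears in the data flow script is shown below; the name mydate is assumed here only to match the expression used later.

parameters{
     mydate as string
}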


Use this parameter in the StartDate setting of the source with the below expression.

toTimestamp($mydate,'yyyy-MM-dd\'T\'HH:mm:ss','UTC')


Also, as you want to transform multiple files, add a wildcard path in the source options, as sketched below.
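Putting the date filter and the wildcard together, the source in the data flow script would look roughly like the following sketch. This is an approximation rather than the exact generated script: modifiedAfter and wildcardPaths are the script properties behind the Filter by last modified and Wildcard paths options, and temp/*.csv is only an example folder and pattern.

source(allowSchemaDrift: true,
     validateSchema: false,
     ignoreNoFilesFound: false,
     modifiedAfter: (toTimestamp($mydate,'yyyy-MM-dd\'T\'HH:mm:ss','UTC')),
     wildcardPaths:['temp/*.csv']) ~> source1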

Add your respective sink. Here, I have used ADLS Gen2 as a sample.

Now, in the pipeline, pass the pipeline variable to the Dataflow parameter using a pipeline expression, for example as below.
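For example, if the Set variable activity above stores the converted value in a variable named converteddate (an assumed name), the Dataflow parameter value would be set with the pipeline expression:

@variables('converteddate')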


These are sample input files in the temp folder.


Execute the pipeline, and you can see that only the file sample2.csv is copied to the target as per the given UTC start time.


Upvotes: 0
