Reputation: 29
I want to read files from a data lake in a Dataflow based on a datetime filter, and I am trying to pass the start time from the pipeline, which I get from a SQL table. I have tried multiple ways, but it is not working.
Upvotes: 0
Views: 354
Reputation: 11284
As you are getting the required date from a SQL table, use a Lookup activity with First row only enabled on your SQL table to get that value.
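For example, the Lookup's query could be something like the below. The table and column names here are placeholders; substitute your own.

SELECT TOP 1 start_time FROM dbo.watermark_table;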
Convert the required value to the correct date format with the below expression using a string type Set variable activity. Here, I have passed a hardcoded date value; in your case, it will be @activity('lookup1').output.firstRow.<date_column> in the below expression.
@convertToUtc('1/16/2024 13:00:00','UTC')
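For reference, a sketch of the non-hardcoded version, assuming the Lookup activity is named lookup1 and the SQL column is named start_time (both names are placeholders):

@convertToUtc(activity('lookup1').output.firstRow.start_time,'UTC')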
This will be converted to UTC format, but it will still be a string.
In the Dataflow, create a string type parameter.
Use this parameter in the StartDate of the source with the below expression.
toTimestamp($mydate,'yyyy-MM-dd\'T\'HH:mm:ss','UTC')
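Note that convertToUtc without a format argument returns the value with fractional seconds (for example, 2024-01-16T13:00:00.0000000Z), which may not match the above format string. If the parse fails, one option, to be verified in your environment, is to pass an explicit format to convertToUtc in the Set variable activity so both sides agree:

@convertToUtc('1/16/2024 13:00:00','UTC','yyyy-MM-dd''T''HH:mm:ss')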
Also, as you want to transform multiple files, add a wildcard path in the source settings (a sample path is shown below).
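For instance, assuming the .csv files sit in a folder named temp (as in the sample below), a wildcard path like this would match all of them:

temp/*.csv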
Add your respective sink. Here, I have used ADLS Gen2 as a sample.
Now, in the pipeline, pass the pipeline variable to the Dataflow parameter using a pipeline expression like below.
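For instance, if the Set variable activity stored the converted value in a variable named mydate (a placeholder name), the Dataflow parameter value would be:

@variables('mydate')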
These are sample input files in the temp folder.
Execute the pipeline, and you can see that only the file sample2.csv is copied to the target as per the given UTC start time.
Upvotes: 0