Reputation: 1
I'm currently working on a project where I need the Data Factory pipeline to copy data based on the last run date.
The process breakdown....
Upvotes: 0
Views: 577
Reputation: 1806
After reading the post a couple of times, here is what I understood.
If I understand the ask correctly, use the dayOfWeek() function: add an If activity and let the current logic execute only when the day of the week is Monday (2) through Friday (6).
https://learn.microsoft.com/en-us/azure/data-factory/data-flow-expressions-usage#dayofweek
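The gate described above can be sketched in Python (the helper name is hypothetical). Note the numbering differs: Python's `weekday()` uses Monday=0 through Sunday=6, while the ADF data-flow `dayOfWeek()` linked above numbers days Sunday=1 through Saturday=7, which is why the answer says Monday(2) to Friday(6).

```python
from datetime import datetime

def is_business_day(ts: datetime) -> bool:
    # Python: Monday=0 .. Sunday=6, so Monday-Friday is weekday() < 5.
    # Equivalent ADF data-flow check: dayOfWeek() between 2 and 6.
    return ts.weekday() < 5
```

In the pipeline itself, this check would live in the If activity's expression rather than in Python; the sketch only illustrates the weekday window.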
Upvotes: 0
Reputation: 7156
Instead of adding 1 to the last high-water-mark value, you can update the watermark to the current UTC time. That way, even on days when the pipeline is not triggered, data will still be copied to the correct destination folder. I have tried to reproduce this in my environment, and below is the approach.
select * from tab1 where lastmodified > '@{activity('Lookup1').output.firstRow.watermark_value}'
@concat(formatDateTime(utcnow(),'yyyy'),'/', formatDateTime(utcnow(),'MM'),'/',formatDateTime(utcnow(),'dd'))
is given in the folder path (note 'MM' for month; lowercase 'mm' would give minutes). After the copy succeeds, update the watermark table:
update watermark_table
set watermark_value='@{formatDateTime(utcnow(),'yyyy-MM-dd')}'
where tab_name='tab1'
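The full cycle above (filter on the stored watermark, copy, then advance the watermark to the current UTC time) can be sketched in Python. The function and field names are hypothetical; in the real pipeline these steps are the Lookup, Copy, and Stored procedure / Script activities.

```python
from datetime import datetime, timezone

def folder_path(now: datetime) -> str:
    # Mirrors the ADF expression: '%m' is the month, matching 'MM'
    # in formatDateTime ('mm' there would mean minutes).
    return now.strftime("%Y/%m/%d")

def incremental_copy(rows, watermark, now):
    # Copy only rows modified after the stored watermark, then set the
    # new watermark to the current UTC time (not old watermark + 1 day),
    # so skipped days cannot create a gap.
    copied = [r for r in rows if r["lastmodified"] > watermark]
    return copied, now, folder_path(now)
```

Storing `utcnow()` rather than `watermark + 1 day` is the key design choice: the next run always picks up everything since the last successful run, regardless of how many days passed in between.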
Upvotes: 0