Reputation: 113
I have an ADF pipeline that connects to Salesforce and simply copies the object over to my Azure DB storage. I would like to add a timestamp column to the table that records the day/time this procedure ran, similar to using GETDATE() in T-SQL.
I was wondering if there is a way I could do this in the mapping? Or do I need to create a stored procedure that goes through and updates the timestamps after this runs?
Upvotes: 0
Views: 1238
Reputation: 1
You can add additional columns in the Source tab of the Copy activity. Add a new additional column and set its value to @pipeline().TriggerTime as dynamic content.
Then, in the Mapping tab, make sure you map this new column to your target/destination column.
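In the Copy activity's JSON definition, this corresponds roughly to the following sketch (the column name RunTimestamp is an assumption, not from the post):

```json
"source": {
    "type": "SalesforceSource",
    "additionalColumns": [
        {
            "name": "RunTimestamp",
            "value": "@pipeline().TriggerTime"
        }
    ]
}
```

@pipeline().TriggerTime resolves to the time the pipeline's trigger fired, which is close to (but not exactly) the time each row is copied.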
Upvotes: 0
Reputation: 113
Thank you Leon Yue, that is a great answer, but I found another way to do this that is easier, for me at least.
I simply changed my TimeStamp column to have a default value. This works because I am truncating the table and then inserting every time. I think your answer would work better if I weren't truncating. Below is my default-value code; I have to convert to Eastern time since the default for our system is UTC.
[TimeStamp] [datetime] DEFAULT CONVERT(datetime, SWITCHOFFSET(GETDATE(), DATEPART(TZOFFSET, GETDATE() AT TIME ZONE 'Eastern Standard Time')))
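For context, that default would sit in the table definition roughly like this (a sketch: the table name and the other columns are illustrative placeholders, only the [TimeStamp] default comes from the post):

```sql
-- Illustrative table; [dbo].[SalesforceObject], [Id] and [Name] are placeholders.
CREATE TABLE [dbo].[SalesforceObject] (
    [Id]        nvarchar(18)  NOT NULL,
    [Name]      nvarchar(255) NULL,
    -- GETDATE() returns the server's local time (UTC here), and SWITCHOFFSET
    -- shifts it by the current Eastern-time offset, handling DST via AT TIME ZONE.
    [TimeStamp] [datetime] NOT NULL
        DEFAULT CONVERT(datetime,
            SWITCHOFFSET(GETDATE(),
                DATEPART(TZOFFSET, GETDATE() AT TIME ZONE 'Eastern Standard Time')))
);
```

With the default in place, any INSERT that omits [TimeStamp] (as the Copy activity's inserts do) gets the load time filled in automatically.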
Upvotes: 0
Reputation: 16401
As far as I know about Data Factory, you cannot do that in the mapping. Data Flow can, but it doesn't support Salesforce as a source dataset.
You should first edit your Azure SQL table schema and add a column to store the pipeline's run timestamp.
Then create a stored procedure that stores the object data from Salesforce and writes the current date into the new column as the timestamp.
When you call the stored procedure from Data Factory, that current date is the day/time the procedure ran.
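A minimal sketch of such a stored procedure (the table, column, and procedure names are placeholders; a real Copy activity sink would typically pass the Salesforce rows in via a table-valued parameter rather than one row at a time):

```sql
-- Sketch only: names are illustrative, not from the post.
CREATE PROCEDURE [dbo].[uspInsertSalesforceObject]
    @Id   nvarchar(18),
    @Name nvarchar(255)
AS
BEGIN
    SET NOCOUNT ON;
    -- Stamp each row with the time the procedure ran.
    INSERT INTO [dbo].[SalesforceObject] ([Id], [Name], [TimeStamp])
    VALUES (@Id, @Name, GETDATE());
END
```

The Copy activity's Azure SQL sink can be pointed at this procedure via its stored procedure name setting, so every copied row is stamped as it is written.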
Hope this helps.
Upvotes: 1