Reputation: 109
I have a data factory pipeline that receives JSON files from an Azure Blob storage.
These files have the following structure:
{
    "Time": {
        "UTC": {
            "Sent": "2020-09-01T11:45:00.0Z"
        }
    },
    "LocalTime": {
        "Original": {
            "Sent": "2020-09-01T13:45:00+02:00"
        }
    }
}
When the Lookup activity reads the file from the blob, it converts the local time to UTC. I would like to ignore the offset and just grab the datetime as-is.
How do I go about doing this?
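To illustrate the behavior outside ADF, here is a minimal Python sketch (using the sample payload above) showing how normalizing an offset-aware timestamp to UTC discards the local wall-clock time, while keeping the raw string preserves it:

```python
import json
from datetime import datetime, timezone

# Sample payload matching the file structure from the question
payload = '''{
  "Time":      {"UTC":      {"Sent": "2020-09-01T11:45:00.0Z"}},
  "LocalTime": {"Original": {"Sent": "2020-09-01T13:45:00+02:00"}}
}'''

doc = json.loads(payload)
local_raw = doc["LocalTime"]["Original"]["Sent"]

# Parsing the timestamp and normalizing to UTC loses the local wall-clock
# time -- analogous to what the Lookup activity does:
normalized = datetime.fromisoformat(local_raw).astimezone(timezone.utc)
print(normalized.isoformat())   # 2020-09-01T11:45:00+00:00

# Keeping the raw string (or slicing off the offset) preserves it as-is:
print(local_raw[:19])           # 2020-09-01T13:45:00
```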
Upvotes: 0
Views: 410
Reputation: 16431
According to your comment:
We are glad to hear that you found a solution. I am posting it as an answer so others can reference it. This can also benefit other community members. Thank you!
Upvotes: 1
Reputation: 1806
Thanks for posting the question; I had never expected it to behave the way it did. For clarity: @activity('Lookup1').output.firstRow.LocalTime.Original.Sent returns "2020-09-01T11:45Z" and not "2020-09-01T13:45:00+02:00".
This is what I tried while creating the dataset: define it as a delimited text dataset rather than JSON, with the intent of reading the entire file content as a single string. Adjust the column and row delimiters so the whole file lands in one column.
Now you can use the expression (I hate to hard-code, but we can surely make it dynamic):
@substring(string(activity('Lookup1').output.firstRow),187,16)
Output
{
    "name": "Datepart",
    "value": "2020-09-01T13:45"
}
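The hard-coded offset 187 is brittle: it breaks if the file's whitespace or key order changes. A minimal Python sketch (assuming the sample file content from the question, read as one string by the delimited-text dataset) of locating the value by pattern instead of by fixed position:

```python
import re

# The whole file content read as one string (assumption: sample from the question)
first_row = ('{ "Time": { "UTC": { "Sent": "2020-09-01T11:45:00.0Z" } }, '
             '"LocalTime": { "Original": { "Sent": "2020-09-01T13:45:00+02:00" } } }')

# Instead of a hard-coded character offset, locate the yyyy-MM-ddTHH:mm part
# of the Sent value that follows the "LocalTime" key:
match = re.search(
    r'"LocalTime".*?"Sent":\s*"(\d{4}-\d{2}-\d{2}T\d{2}:\d{2})', first_row)
print(match.group(1))   # 2020-09-01T13:45
```

In the pipeline itself, the same idea could be expressed with the ADF string functions (e.g. computing the start position with indexOf rather than hard-coding it).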
HTH
Upvotes: 0