Reputation: 647
Since ADF (Azure Data Factory) isn't able to handle complex/nested JSON objects, I'm using OPENJSON in SQL to parse them. But I can't get the 'raw' JSON from the following object:
{
  "rows": [
    {
      "name": "Name1",
      "attribute1": "attribute1",
      "attribute2": "attribute2"
    },
    {
      "name": "Name2",
      "attribute1": "attribute1",
      "attribute2": "attribute2"
    },
    {
      "name": "Name3",
      "attribute1": "attribute1",
      "attribute2": "attribute2"
    }
  ]
}
Config 1
I get all the names listed
Result:
Config 2
When I use this config:
I get the whole JSON in one record:
Result:
Needed config
But, what I want, is this result:
Result:
So, I need the iteration of Config 1, but with the raw JSON per row. Every time I use $['rows'] or $['rows'][0], it seems to 'forget' to iterate.
Anyone?
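For what it's worth, OPENJSON with its default schema (no WITH clause) already iterates the array and exposes each element's raw JSON text in its value column; and a WITH clause can combine typed columns with the raw JSON via AS JSON. A minimal sketch against the sample payload (the variable name @json is illustrative):

```sql
-- Sample payload from the question (illustrative variable name)
DECLARE @json NVARCHAR(MAX) = N'{
  "rows": [
    { "name": "Name1", "attribute1": "attribute1", "attribute2": "attribute2" },
    { "name": "Name2", "attribute1": "attribute1", "attribute2": "attribute2" },
    { "name": "Name3", "attribute1": "attribute1", "attribute2": "attribute2" }
  ]
}';

-- Default schema: one result row per array element;
-- [value] holds each element's raw JSON text.
SELECT [key], [value]
FROM OPENJSON(@json, '$.rows');

-- Typed columns plus the raw JSON per row: AS JSON on path '$'
-- returns the whole current element as NVARCHAR(MAX).
SELECT j.[name], j.raw_json
FROM OPENJSON(@json, '$.rows')
WITH (
    [name]   NVARCHAR(100) '$.name',
    raw_json NVARCHAR(MAX) '$' AS JSON
) AS j;
```

Note that AS JSON requires the column type to be NVARCHAR(MAX).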
Upvotes: 1
Views: 7226
Reputation: 16431
The Copy activity can help us achieve this.
For example, I copy B.json
from the container "backup" to another Blob container, "testcontainer".
This is my B.json
source dataset:
Source:
Mapping:
Pipeline executed successfully:
Check the data in testcontainer:
Hope this helps.
Update:
Copy the nested JSON to SQL.
The source is the same B.json
in Blob storage.
Sink dataset:
Sink:
Mapping:
Run pipeline:
Check the data in SQL database:
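Once the JSON has landed in SQL as a single text column, OPENJSON can shred it into rows there. A sketch, assuming a staging table dbo.StagingJson with an NVARCHAR(MAX) column named payload (both names are assumptions; adjust to your sink dataset):

```sql
-- Assumed staging table and column names; adjust to your sink dataset.
SELECT r.[name], r.attribute1, r.attribute2
FROM dbo.StagingJson AS s
CROSS APPLY OPENJSON(s.payload, '$.rows')
WITH (
    [name]     NVARCHAR(100) '$.name',
    attribute1 NVARCHAR(100) '$.attribute1',
    attribute2 NVARCHAR(100) '$.attribute2'
) AS r;
```

CROSS APPLY runs OPENJSON once per staged row, so this also works when the table holds more than one JSON document.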
Upvotes: 0
Reputation: 3838
Have you tried Data Flows to handle JSON structures? That capability is built in, with data flow transformations like derived column, flatten, and sink mapping.
Upvotes: 0