Reputation: 3283
I have a really frustrating error trying to parse basic JSON read from Blob Storage using a dataset within ADF.
My JSON is below:
[{"Bid":0.197514880839,"BaseCurrency":"AED"}
,{"Bid":0.535403560434,"BaseCurrency":"AUD"}
,{"Bid":0.351998712241,"BaseCurrency":"BBD"}
,{"Bid":0.573128306234,"BaseCurrency":"CAD"}
,{"Bid":0.787556605631,"BaseCurrency":"CHF"}
,{"Bid":0.0009212964,"BaseCurrency":"CLP"}
,{"Bid":0.115389497248,"BaseCurrency":"DKK"}
]
I have tried all three JSON source settings, and every one of them gives the error:
Malformed records are detected in schema inference. Parse Mode: FAILFAST
The three settings being:
Single document
Array of documents
Document per line
Can anyone help? I simply need this to be a list of objects, that's it!
Paul
Upvotes: 5
Views: 7430
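The payload itself looks syntactically fine, which can be checked locally. The sketch below (a trimmed copy of the array above, parsed with Python's standard json module) shows the text is a well-formed array of documents, so the ADF failure likely comes from something other than the JSON syntax, such as the file's encoding:

```python
import json

# A trimmed copy of the payload from the question, including the
# leading-comma line style, which is legal whitespace-wise in JSON.
payload = '''[{"Bid":0.197514880839,"BaseCurrency":"AED"}
,{"Bid":0.535403560434,"BaseCurrency":"AUD"}
,{"Bid":0.351998712241,"BaseCurrency":"BBD"}
]'''

# json.loads accepts this without complaint: it is a valid JSON array.
rates = json.loads(payload)
print(len(rates))                 # 3
print(rates[0]["BaseCurrency"])   # AED
```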
Reputation: 1
What if the structure is like this?
{
  "GroupId": 1,
  "SubGroups": [
    {
      "GroupId": 101,
      "SubGroups": [
        {
          "GroupId": 10101,
          "SubGroups": [],
          "Type": "AA",
          "Name": "Name1"
        }
      ],
      "Type": "A",
      "Name": "Name2"
    },
    {
      "GroupId": 102,
      "SubGroups": [],
      "Type": "B",
      "Name": "Name3"
    }
  ]
}
Upvotes: 0
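A shape like this is a single nested document rather than a top-level array, so "Single document" would be the matching source setting. As a minimal sketch of how such recursively nested SubGroups can be walked (the helper name walk_groups is made up here, not an ADF feature):

```python
import json

# An equivalent nested document: each node may carry further SubGroups.
doc = json.loads('''{
  "GroupId": 1,
  "SubGroups": [
    {"GroupId": 101,
     "SubGroups": [{"GroupId": 10101, "SubGroups": [], "Type": "AA", "Name": "Name1"}],
     "Type": "A", "Name": "Name2"},
    {"GroupId": 102, "SubGroups": [], "Type": "B", "Name": "Name3"}
  ]
}''')

def walk_groups(node, depth=0):
    # Yield (depth, GroupId) for this node, then recurse into SubGroups.
    yield depth, node["GroupId"]
    for child in node.get("SubGroups", []):
        yield from walk_groups(child, depth + 1)

print(list(walk_groups(doc)))
# [(0, 1), (1, 101), (2, 10101), (1, 102)]
```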
Reputation: 21
We faced this issue when the JSON file was encoded as UTF-8 with BOM; the ADF Data Flow is unable to parse such files. If you specify the encoding as UTF-8 without BOM when creating the file, it will work.
In my case, I am using a Copy activity to merge and create the JSON file; specifying the encoding as UTF-8 without BOM resolved my issue.
Note: For some reason, we can't use a dataset with "UTF-8 without BOM" encoding in a Data Flow. In that case, you can create two datasets: one with the default UTF-8 encoding (used in the Data Flow) and one with UTF-8 without BOM (used in the Copy activity sink when creating the file).
Thank you.
Upvotes: 2
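The BOM difference described above can be reproduced and checked locally. In Python, writing with the "utf-8-sig" codec prepends the 3-byte BOM (EF BB BF) while plain "utf-8" does not; the file names below are just examples, and this only inspects the files themselves, not ADF's behaviour:

```python
import codecs
import json

rates = [{"Bid": 0.1975, "BaseCurrency": "AED"}]

# "utf-8-sig" writes a UTF-8 BOM at the start of the file; "utf-8" does not.
with open("with_bom.json", "w", encoding="utf-8-sig") as f:
    json.dump(rates, f)
with open("no_bom.json", "w", encoding="utf-8") as f:
    json.dump(rates, f)

def has_bom(path):
    # A UTF-8 BOM is the byte prefix EF BB BF (codecs.BOM_UTF8).
    with open(path, "rb") as f:
        return f.read(3) == codecs.BOM_UTF8

print(has_bom("with_bom.json"))  # True
print(has_bom("no_bom.json"))    # False
```

Checking the first bytes of the Blob file this way is a quick test of whether the BOM is the cause before changing any datasets.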