Reputation: 77
Data Flow is not able to read an API-response JSON file stored in Blob storage. If the same file is placed in that location manually, it works fine, but for the JSON written from the API response the data flow reports a corrupted file. In the dataset I'm able to preview the file, but in the data flow it doesn't work.
Upvotes: 1
Views: 778
Reputation: 97
I had a similar issue. The same dataset that saves the file to Blob storage wasn't able to read it back through a data flow, although preview from the dataset worked fine. I checked the saved file's encoding and found that it was somehow being saved with UTF-8-BOM encoding, even though my dataset was configured to save it as plain UTF-8. Converting the encoding back to UTF-8 in Notepad++ and replacing the old blob with the new one made it work with the same dataset, but that was not a permanent solution.
For a permanent solution, I created a new dataset, also with UTF-8 as the default encoding, just for the data flow. Now I have two separate datasets: one for writing to blob storage and one for reading from it. I don't know what difference having a separate dataset for reading makes, but I'm glad it worked for me.
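If you can't restructure your datasets, stripping the BOM from the blob programmatically is another workaround. A minimal sketch using the `azure-storage-blob` Python SDK, where the connection string, container, and blob names are placeholders you'd replace with your own:

```python
from azure.storage.blob import BlobClient

# Placeholder values -- replace with your own storage account details.
CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;"
CONTAINER = "my-container"       # hypothetical container name
BLOB_NAME = "api-response.json"  # hypothetical blob name

UTF8_BOM = b"\xef\xbb\xbf"

blob = BlobClient.from_connection_string(CONN_STR, CONTAINER, BLOB_NAME)

# Download the raw bytes and check for a UTF-8 byte order mark.
data = blob.download_blob().readall()
if data.startswith(UTF8_BOM):
    # Strip the BOM and overwrite the blob with plain UTF-8 content.
    blob.upload_blob(data[len(UTF8_BOM):], overwrite=True)
    print(f"Stripped BOM from {BLOB_NAME}")
else:
    print(f"{BLOB_NAME} has no BOM; nothing to do")
```

You could run something like this in an Azure Function or as a pipeline step between the copy activity and the data flow.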
Good luck!
Upvotes: 0
Reputation: 9519
I think your problem is a JSON parsing error; see Error code: DF-Executor-SystemInvalidJson.
- Message: JSON parsing error, unsupported encoding or multiline
- Causes: Possible issues with the JSON file: unsupported encoding, corrupt bytes, or using JSON source as single document on many nested lines
- Recommendation: Verify the JSON file's encoding is supported. On the Source transformation that is using a JSON dataset, expand 'JSON Settings' and turn on 'Single Document'.
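As a quick way to tell which of those causes applies, you can inspect a local copy of the file before touching the data flow settings. A rough sketch (the file name is a placeholder) that checks for a BOM and for whether the content is one JSON document spanning multiple lines, which is the case where 'Single Document' must be enabled:

```python
import json

PATH = "api-response.json"  # hypothetical local copy of the blob

with open(PATH, "rb") as f:
    raw = f.read()

# Unsupported-encoding check: a UTF-8 BOM at the start of the file.
if raw.startswith(b"\xef\xbb\xbf"):
    print("File starts with a UTF-8 BOM")

text = raw.decode("utf-8-sig")  # tolerate a BOM for the structure check

# Line-delimited JSON: every non-empty line parses on its own.
lines = [ln for ln in text.splitlines() if ln.strip()]
try:
    for ln in lines:
        json.loads(ln)
    print("Looks like line-delimited JSON ('Document per line')")
except json.JSONDecodeError:
    # Otherwise try the whole file as one document spread across lines.
    json.loads(text)  # raises if the bytes really are corrupt
    print("One JSON document across multiple lines -> enable 'Single Document'")
```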
Upvotes: 1