TimWagaman

Reputation: 1008

Azure Data Factory Splitting Multiple JSON Objects In A Single File

I'm using Azure Data Factory to monitor an AWS S3 bucket that will contain files of JSON objects written out by an AWS process. The process may combine multiple JSON objects into a single file with no CRLF or delimiters between the objects. I need Azure Data Factory to process each of these objects individually and insert them into a SQL database. I haven't found any examples of how to handle this scenario. Sorry if this is rather basic in Azure Data Factory; I'm new to the product.

Here is a sample of the file format:

{
  "AWSInfoField1": "Test Record 1",
  "AWSInfoField2": "Just Another Field",
  "Attributes": {
    "Attribute1": 1,
    "Attribute2": "Another Attribute"
  }
}
{
  "AWSInfoField1": "Test Record 2",
  "AWSInfoField2": "Just Another Field In Record 2",
  "Attributes": {
    "Attribute1": 2,
    "Attribute2": "Another Attribute In Record 2"
  }
}
{
  "AWSInfoField1": "Test Record 3",
  "AWSInfoField2": "Just Another Field In Record 3",
  "Attributes": {
    "Attribute1": 3,
    "Attribute2": "Another Attribute In Record 3"
  }
}

Upvotes: 0

Views: 1457

Answers (1)

Leon Yue

Reputation: 16431

I copied the data to a single file in my storage (with no CRLF or delimiters between the objects): [screenshot]

I tried this and found that Data Factory automatically adds the default ',' delimiter between the JSON objects in the source. We can see this in the source data preview: [screenshot]

Then choose the SQL database as the sink and map the data to the sink table: [screenshot]

Run the pipeline and check the data in the table: [screenshot]
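As a side note, if you ever need to split a concatenated-JSON file like this yourself (for example, in a pre-processing step outside Data Factory), Python's standard `json.JSONDecoder.raw_decode` can walk the text one object at a time. A minimal sketch; the function name `split_concatenated_json` is just for illustration:

```python
import json


def split_concatenated_json(text):
    """Yield each JSON object from a string of back-to-back
    objects with no delimiters between them."""
    decoder = json.JSONDecoder()
    idx = 0
    while idx < len(text):
        # raw_decode raises on leading whitespace, so skip it first
        while idx < len(text) and text[idx].isspace():
            idx += 1
        if idx >= len(text):
            break
        # raw_decode returns the parsed object and the index
        # just past it, which becomes the next starting point
        obj, idx = decoder.raw_decode(text, idx)
        yield obj


sample = (
    '{"AWSInfoField1": "Test Record 1", "Attributes": {"Attribute1": 1}}'
    '{"AWSInfoField1": "Test Record 2", "Attributes": {"Attribute1": 2}}'
)
records = list(split_concatenated_json(sample))
```

Each element of `records` is then an ordinary dict that can be inserted into SQL row by row.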

Upvotes: 0
