Reputation: 3
I am looking for an easy way to dynamically copy data from JSONs to SQL tables. The JSONs all share the same structure, where the relevant data appears under the fields property, like so:
{
  "name": "Employees",
  "description": "Employeedata",
  "fields": [
    {
      "id": "EmployeeId",
      "fieldId": "U001",
      "dataType": "string",
      "label": "Employee",
      "length": 15,
      "controlType": 5,
      "decimals": 0,
      "decimalsFieldId": ""
    },
    {
      "id": "PersonId",
      "fieldId": "U003",
      "dataType": "string",
      "label": "EmpID",
      "length": 15,
      "controlType": 5,
      "decimals": 0,
      "decimalsFieldId": ""
    },
    {
      "id": "EmployerId",
      "fieldId": "U002",
      "dataType": "string",
      "label": "Employer",
      "length": 15,
      "controlType": 5,
      "decimals": 0,
      "decimalsFieldId": ""
    }
  ]
}
The main structure is always the same, so the relevant data is always under the fields property. However, the columns and the corresponding data differ for each JSON.
When I perform the copy activity as is, the table ends up like this:
Name | Description |
---|---|
Employees | Employeedata |
What I want is something like this:
id | fieldId | dataType | label | length | controlType | decimals | decimalsFieldId |
---|---|---|---|---|---|---|---|
EmployeeId | U001 | string | Employee | 15 | 5 | 0 | |
PersonId | U003 | string | EmpID | 15 | 5 | 0 | |
EmployerId | U002 | string | Employer | 15 | 5 | 0 | |
I am looking for a way to copy only the array under fields to an SQL table, without having to manually assign a mapping for each JSON.
Upvotes: 0
Views: 245
Reputation: 51
Move the JSON file to the SQL database with a Copy activity, using the JSON file as the source.
Set the sink to the SQL database table.
On the Mapping tab, import the schema, enable the Advanced editor, and set the Collection reference to the fields array so that each array element becomes its own row (a sketch of the resulting mapping follows these steps).
Delete the column mappings you do not need (for example the top-level name and description) to get the expected result.
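For reference, the mapping configured through the Advanced editor is stored in the Copy activity as a TabularTranslator, with collectionReference pointing at the array to flatten. Below is a minimal sketch for the example in the question; the column names are taken from the question's fields objects, and the exact JSON your pipeline generates may differ (for example in sink types):
{
  "type": "TabularTranslator",
  "collectionReference": "$['fields']",
  "mappings": [
    { "source": { "path": "['id']" },              "sink": { "name": "id" } },
    { "source": { "path": "['fieldId']" },         "sink": { "name": "fieldId" } },
    { "source": { "path": "['dataType']" },        "sink": { "name": "dataType" } },
    { "source": { "path": "['label']" },           "sink": { "name": "label" } },
    { "source": { "path": "['length']" },          "sink": { "name": "length" } },
    { "source": { "path": "['controlType']" },     "sink": { "name": "controlType" } },
    { "source": { "path": "['decimals']" },        "sink": { "name": "decimals" } },
    { "source": { "path": "['decimalsFieldId']" }, "sink": { "name": "decimalsFieldId" } }
  ]
}
The collectionReference makes the copy activity iterate over the fields array and write one row per element, with the relative paths in mappings resolved against each element.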
With this mapping, the result in the SQL database matches the expected output.
Upvotes: 1