Reputation: 131
I have telemetry data sent continuously to my Azure SQL database, which ends up in its landing table under the format:
INSERT INTO landing (ID, rawData)
VALUES
('Device1', '[{"value1":"str1","value2":-109.0,"value3":3,"uselessvalue":16.0},{"value1":"str2","value2":-111.0,"value3":3,"uselessvalue":21.0},{"value1":"str3","value2":-113.0,"value3":2,"uselessvalue":18.0}]');
The number of JSON objects in the array can vary from one to several dozen.
I am trying to split these values into separate rows in another table, so that they can be inserted like this (probably from a trigger):
INSERT INTO parsedData (ID, value1, value2, value3)
VALUES
('Device1', 'str1', -109.0, 3),
('Device1', 'str2', -111.0, 3),
('Device1', 'str3', -113.0, 2);
I have found several approaches (splitting on commas, WHILE loops, keywords unsupported in Azure SQL such as UNNEST), but none of them has worked so far.
Upvotes: 0
Views: 1294
Reputation: 222592
In SQL Server, you can unnest a JSON array with openjson():
insert into parseddata (id, value1, value2, value3)
select l.id, x.*
from landing l
cross apply openjson(l.rawdata) with (
    value1 varchar(50) '$.value1',
    value2 decimal(5, 2) '$.value2',
    value3 int '$.value3'
) x
I had to make a few assumptions about the datatypes within the JSON objects, which you might need to review.
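Since you mention running this from a trigger, here is a hedged sketch of how the same openjson() query could be wrapped in an AFTER INSERT trigger so only the newly landed rows get parsed. The trigger name is made up, and the table/column names and datatypes are assumed from the question:

```sql
-- Sketch: AFTER INSERT trigger on landing (assumed schema).
-- The virtual "inserted" table holds only the rows added by the
-- triggering statement, so each batch is parsed exactly once.
CREATE TRIGGER trg_landing_parse
ON landing
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO parsedData (ID, value1, value2, value3)
    SELECT i.ID, x.value1, x.value2, x.value3
    FROM inserted i
    CROSS APPLY OPENJSON(i.rawData) WITH (
        value1 varchar(50)   '$.value1',
        value2 decimal(5, 2) '$.value2',
        value3 int           '$.value3'
    ) x;
END;
```

Note that the trigger fires once per statement, not once per row, which is why it reads from inserted rather than handling a single row at a time.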
Upvotes: 1