Reputation: 11
I receive an Avro file in my Data Lake Store through Stream Analytics and an Event Hub using Capture.
The structure of the file looks like this:
[{"id":1,"pid":"abc","value":"1","utctimestamp":1537805867},{"id":6569,"pid":"1E014000","value":"-5.8","utctimestamp":1537805867}]
[{"id":2,"pid":"cde","value":"77","utctimestamp":1537772095},{"id":6658,"pid":"02002001","value":"77","utctimestamp":1537772095}]
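To show what I mean by "flattened": each Body is a JSON array of objects, and I want one CSV row per object. A throwaway Python sketch of the transformation I'm after (not part of the U-SQL job; field names are the ones from the sample above):

```python
import json

# Sample Body payloads as captured: each Body is a JSON array of objects.
bodies = [
    '[{"id":1,"pid":"abc","value":"1","utctimestamp":1537805867},'
    '{"id":6569,"pid":"1E014000","value":"-5.8","utctimestamp":1537805867}]',
    '[{"id":2,"pid":"cde","value":"77","utctimestamp":1537772095},'
    '{"id":6658,"pid":"02002001","value":"77","utctimestamp":1537772095}]',
]

rows = []
for body in bodies:
    for record in json.loads(body):  # one output row per object in the array
        rows.append((record["id"], record["pid"], record["value"], record["utctimestamp"]))

for r in rows:
    print(",".join(str(x) for x in r))  # first line: 1,abc,1,1537805867
```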
I've used this script:
@rs =
    EXTRACT SequenceNumber long,
            Offset string,
            EnqueuedTimeUtc string,
            Body byte[]
    FROM @input_file
    USING new Microsoft.Analytics.Samples.Formats.ApacheAvro.AvroExtractor(@"
    {
        ""type"": ""record"",
        ""name"": ""EventData"",
        ""namespace"": ""Microsoft.ServiceBus.Messaging"",
        ""fields"": [
            { ""name"": ""SequenceNumber"", ""type"": ""long"" },
            { ""name"": ""Offset"", ""type"": ""string"" },
            { ""name"": ""EnqueuedTimeUtc"", ""type"": ""string"" },
            { ""name"": ""SystemProperties"",
              ""type"": { ""type"": ""map"", ""values"": [ ""long"", ""double"", ""string"", ""bytes"" ] } },
            { ""name"": ""Properties"",
              ""type"": { ""type"": ""map"", ""values"": [ ""long"", ""double"", ""string"", ""bytes"", ""null"" ] } },
            { ""name"": ""Body"", ""type"": [ ""null"", ""bytes"" ] }
        ]
    }");
@jsonify =
    SELECT Microsoft.Analytics.Samples.Formats.Json.JsonFunctions.JsonTuple(Encoding.UTF8.GetString(Body)) AS message
    FROM @rs;

@cnt =
    SELECT message["id"] AS id,
           message["id2"] AS pid,
           message["value"] AS value,
           message["utctimestamp"] AS utctimestamp,
           message["extra"] AS extra
    FROM @jsonify;

OUTPUT @cnt TO @output_file USING Outputters.Text(quoting: false);
The script produces a file, but it contains only delimiting commas and no values.
How do I extract/transform this structure so I can output it as a flattened four-column CSV file?
Upvotes: 1
Views: 272
Reputation: 14389
I got this to work by exploding the JSON column and applying the JsonTuple
function a second time (though I suspect it could be simplified):
@jsonify =
    SELECT JsonFunctions.JsonTuple(Encoding.UTF8.GetString(Body)) AS message
    FROM @rs;

// Explode the tuple into key-value pairs.
@working =
    SELECT key,
           JsonFunctions.JsonTuple(value) AS value
    FROM @jsonify
    CROSS APPLY
        EXPLODE(message) AS y(key, value);
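If it helps to see what's going on conceptually, here is a rough Python analogue of the two JsonTuple passes plus EXPLODE (this is my approximation of how JsonTuple indexes an array, not the actual library behavior):

```python
import json

# First JsonTuple pass: the Body JSON array becomes a map of
# position -> nested JSON string.
body = ('[{"id":1,"pid":"abc","value":"1","utctimestamp":1537805867},'
        '{"id":6569,"pid":"1E014000","value":"-5.8","utctimestamp":1537805867}]')
message = {str(i): json.dumps(obj) for i, obj in enumerate(json.loads(body))}

# EXPLODE: one (key, value) row per map entry.
# Second JsonTuple pass: parse each nested JSON string into its own map.
working = [(key, json.loads(value)) for key, value in message.items()]

for key, value in working:
    print(key, value["id"], value["pid"], value["value"], value["utctimestamp"])
```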
Full script:
REFERENCE ASSEMBLY Avro;
REFERENCE ASSEMBLY [Newtonsoft.Json];
REFERENCE ASSEMBLY [Microsoft.Analytics.Samples.Formats];
USING Microsoft.Analytics.Samples.Formats.Json;
DECLARE @input_file string = @"\input\input21.avro";
DECLARE @output_file string = @"\output\output.csv";
@rs =
    EXTRACT Body byte[]
    FROM @input_file
    USING new Microsoft.Analytics.Samples.Formats.ApacheAvro.AvroExtractor(@"
    {
        ""type"": ""record"",
        ""name"": ""EventData"",
        ""namespace"": ""Microsoft.ServiceBus.Messaging"",
        ""fields"": [
            { ""name"": ""SequenceNumber"", ""type"": ""long"" },
            { ""name"": ""Offset"", ""type"": ""string"" },
            { ""name"": ""EnqueuedTimeUtc"", ""type"": ""string"" },
            { ""name"": ""SystemProperties"",
              ""type"": { ""type"": ""map"", ""values"": [ ""long"", ""double"", ""string"", ""bytes"" ] } },
            { ""name"": ""Properties"",
              ""type"": { ""type"": ""map"", ""values"": [ ""long"", ""double"", ""string"", ""bytes"", ""null"" ] } },
            { ""name"": ""Body"", ""type"": [ ""null"", ""bytes"" ] }
        ]
    }");
@jsonify =
    SELECT JsonFunctions.JsonTuple(Encoding.UTF8.GetString(Body)) AS message
    FROM @rs;

// Explode the tuple into key-value pairs.
@working =
    SELECT key,
           JsonFunctions.JsonTuple(value) AS value
    FROM @jsonify
    CROSS APPLY
        EXPLODE(message) AS y(key, value);

@cnt =
    SELECT value["id"] AS id,
           value["id2"] AS pid,
           value["value"] AS value,
           value["utctimestamp"] AS utctimestamp,
           value["extra"] AS extra
    FROM @working;

OUTPUT @cnt TO @output_file USING Outputters.Text(quoting: false);
My results:
Upvotes: 2