Reputation: 548
In my earlier post, SQL Server complains about invalid json, I was advised to use "appropriate methods" for building a JSON string that is then inserted into a SQL Server table for logging purposes. In that post, I was using string concatenation to build the JSON string.
What are the appropriate tools/functions for building JSON within a Data Factory pipeline? I've looked into the json() and string() functions, but they would still rely on concatenation.
Clarification: Right now I'm using string concatenation to generate the logging JSON. Is there a better, more elegant (but lightweight) way to generate the JSON data? I'm trying to generate a logging message that looks like this:
{ "EventType": "DataFactoryPipelineRunActivity",
"DataFactoryName":"fa603ea7-f1bd-48c0-a690-73b92d12176c",
"DataFactoryPipelineName":"Import Blob Storage Account Key CSV file into generic SQL table using Data Flow Activity Logging to Target SQL Server",
"DataFactoryPipelineActivityName":"Copy Generic CSV Source to Generic SQL Sink",
"DataFactoryPipelineActivityOutput":"{runStatus:{computeAcquisitionDuration:316446,dsl: source() ~> ReadFromCSVInBlobStorage ReadFromCSVInBlobStorage derive() ~> EnrichWithDataFactoryMetadata EnrichWithDataFactoryMetadata sink() ~> WriteToTargetSqlTable,profile:{ReadFromCSVInBlobStorage:{computed:[],lineage:{},dropped:0,drifted:1,newer:1,total:1,updated:0},EnrichWithDataFactoryMetadata:{computed:[],lineage:{},dropped:0,drifted:1,newer:6,total:7,updated:0},WriteToTargetSqlTable:{computed:[],lineage:{__DataFactoryPipelineName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__DataFactoryPipelineName]}]},__DataFactoryPipelineRunId:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__DataFactoryPipelineRunId]}]},id:{mapped:true,from:[{source:ReadFromCSVInBlobStorage,columns:[id]}]},__InsertDateTimeUTC:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__InsertDateTimeUTC]}]},__DataFactoryName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__DataFactoryName]}]},__FileName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__FileName]}]},__StorageAccountName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__StorageAccountName]}]}},dropped:0,drifted:1,newer:0,total:7,updated:7}},metrics:{WriteToTargetSqlTable:{rowsWritten:4,sinkProcessingTime:1436,sources:{ReadFromCSVInBlobStorage:{rowsRead:4}},stages:[{stage:3,partitionTimes:[621],bytesWritten:0,bytesRead:24,streams:{WriteToTargetSqlTable:{type:sink,count:4,partitionCounts:[4],cached:false},EnrichWithDataFactoryMetadata:{type:derive,count:4,partitionCounts:[4],cached:false},ReadFromCSVInBlobStorage:{type:source,count:4,partitionCounts:[4],cached:false}},target:WriteToTargetSqlTable,time:811}]}}},effectiveIntegrationRuntime:DefaultIntegrationRuntime (East US)}",
"DataFactoryPipelineRunID":"63759585-4acb-48af-8536-ae953efdbbb0",
"DataFactoryPipelineTriggerName":"Manual",
"DataFactoryPipelineTriggerType":"Manual",
"DataFactoryPipelineTriggerTime":"2019-11-05T15:27:44.1568581Z",
"Parameters":{
"StorageAccountName":"fa603ea7",
"FileName":"0030_SourceData1.csv",
"TargetSQLServerName":"5a128a64-659d-4481-9440-4f377e30358c.database.windows.net",
"TargetSQLDatabaseName":"TargetDatabase",
"TargetSQLUsername":"demoadmin"
},
"InterimValues":{
"SchemaName":"utils",
"TableName":"vw_0030_SourceData1.csv-2019-11-05T15:27:57.643"
}
}
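For illustration, the concat()-based expression I mean looks roughly like this (a trimmed sketch with a few representative fields, not my full expression; the pipeline() system variables and the activity name come from the sample above):

@concat('{"EventType":"DataFactoryPipelineRunActivity",',
    '"DataFactoryName":"', pipeline().DataFactory, '",',
    '"DataFactoryPipelineName":"', pipeline().Pipeline, '",',
    '"DataFactoryPipelineActivityOutput":', string(activity('Copy Generic CSV Source to Generic SQL Sink').output), ',',
    '"DataFactoryPipelineRunID":"', pipeline().RunId, '"}')

Every quote, comma, and brace has to be managed by hand, which is what made the invalid-json errors from my earlier post so easy to produce.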
Upvotes: 0
Views: 4481
Reputation: 16401
You can use a Data Flow; it can help you build the JSON string within the Data Factory pipeline.
Here's the Data Flow tutorial: Mapping data flow JSON handling.
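For reference, the dsl fragment already visible in your activity output above is this kind of data flow, rendered as data flow script:

source() ~> ReadFromCSVInBlobStorage
ReadFromCSVInBlobStorage derive() ~> EnrichWithDataFactoryMetadata
EnrichWithDataFactoryMetadata sink() ~> WriteToTargetSqlTable

The derive() step is where you can add columns such as your pipeline metadata fields, and a JSON sink (covered in the tutorial) then serializes the rows for you, so you don't concatenate the JSON by hand. (The script above is illustrative, taken from your sample output, not a complete data flow definition.)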
Hope this helps.
Upvotes: 1