Reputation: 141
I'm new to ADF and I'm trying to use a dynamic query to build a script where the output of one activity is inserted into another table.
In the "Reading data" activity, I have a simple script that reads a few records from a table in a Synapse serverless SQL pool. My goal is to insert that output into a table on a Synapse dedicated SQL pool via the "Inserting into table" activity. Here is the dynamic query I have on the "Inserting into table" activity:
@concat( 'INSERT INTO schema_name.table_name', '(', ' column1, column2, column3, column4 ', ')', ' VALUES (', activity('Reading data').output , ')' )
I keep getting the error:
Parse error at line: 6, column: 1020: Incorrect syntax near ']'.
Can you help me understand where the error is? I have no "]" anywhere in my scripts.
Here is the output of the "Reading data" activity:
{ "resultSetCount": 1, "recordsAffected": 0, "resultSets": [ { "rowCount": 1, "rows": [ { "column1": 1, "column2": "testname=", "column3": "testslug", "column4": "2022-08-15" } ] } ], "outputParameters": {}, "outputLogs": "", "outputLogsLocation": "", "outputTruncated": false, "effectiveIntegrationRuntime": "AutoResolveIntegrationRuntime (East US)", "executionDuration": 6, "durationInQueue": { "integrationRuntimeQueue": 1 }, "billingReference": { "activityType": "PipelineActivity", "billableDuration": [ { "meterType": "xxxx", "duration": 0.016666666666666666, "unit": "Hours" } ] } }
And here is the input of the "Inserting into table" activity:
{ "scripts": [ { "type": "Query", "text": "INSERT INTO schemaname_tablename (\ncollumn1,\nColumn2,\nColumn3,\nColumn4\n){\"resultSetCount\":1,\"recordsAffected\":0,\"resultSets\":[{\"rowCount\":1,\"rows\":[{\"id\":1,\"name\":\"testname=\",\"Slug\":\"testslug\",\"created_at\":\"2022-08-15\"}]}],\"outputParameters\":{},\"outputLogs\":\"\",\"outputLogsLocation\":\"\",\"outputTruncated\":false,\"effectiveIntegrationRuntime\":\"AutoResolveIntegrationRuntime (US)\",\"executionDuration\":6,\"durationInQueue\":{\"integrationRuntimeQueue\":1},\"billingReference\":{\"activityType\":\"PipelineActivity\",\"billableDuration\":[{\"meterType\":\"xxxx\",\"duration\":0.016666666666666666,\"unit\":\"Hours\"}]}}" } ] }
Upvotes: 1
Views: 2730
Reputation: 141
OK, after following your feedback and some more digging in the Microsoft Azure docs, it turned out the "]" in the error came from the entire serialized output object (including the brackets of the resultSets array) being concatenated into the VALUES clause. Here is the workflow that worked:
1. Used @activity('Reading data').output.resultSets as the items of a ForEach activity. This reads the output of the "Reading data" activity in the Synapse serverless SQL pool.
2. In the ForEach -> Activities, added a Set Variable activity. The value for this variable is '@item().rows', which reads the "rows" object from each item of the resultSets array; "rows" is itself an array.
3. Then connected the ForEach activity to another Script activity, where I built this dynamic query:
```
@concat('INSERT INTO schema_name.table_name',
'(
column1,
column2,
column3,
column4
)',
' VALUES',
'(',
'''@{variables("rows")[0]}'',',
'''@{variables("rows")[1]}'',',
'''@{variables("rows")[2]}'',',
'''@{variables("rows")[3]}''',
')'
)
```
And it successfully wrote to the table in the Synapse dedicated SQL pool.
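As an aside, for a single-row result like the one in the question, the ForEach and the variable can probably be skipped by indexing the Script activity output directly in the second Script activity's expression. This is only an untested sketch, reusing the placeholder schema, table, and column names from above:
```
@concat(
    'INSERT INTO schema_name.table_name (column1, column2, column3, column4) VALUES (',
    '''', string(activity('Reading data').output.resultSets[0].rows[0].column1), ''', ',
    '''', string(activity('Reading data').output.resultSets[0].rows[0].column2), ''', ',
    '''', string(activity('Reading data').output.resultSets[0].rows[0].column3), ''', ',
    '''', string(activity('Reading data').output.resultSets[0].rows[0].column4), ''')'
)
```
Either way, every value ends up single-quoted, so the target columns rely on implicit conversion, and plain string concatenation is open to SQL injection if the source data isn't trusted; wrapping each value in replace(..., '''', '''''') at least escapes embedded single quotes.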
Upvotes: 1