user1526892

Reputation: 67

Update in Lookup Activity, in Azure data factory

I am using the below flow.

 Copy Data 1: OData -> Blob storage (JSON)
 Copy Data 2: JSON -> Snowflake table
 Lookup
    

Both Copy Data activities work fine.

In the Lookup activity, I have given the query below. (I need to add one key/value pair to the table; source_json is a VARIANT column.)

UPDATE T1 SET source_json = OBJECT_INSERT(source_json, 'device_type', 'web_browser', TRUE);

When I run the above query directly in the Snowflake database it works fine; the table has 25K rows.

When run from the pipeline, it gives the below error:

Multiple SQL statements in a single API call are not supported; use one API call per statement instead.

Any suggestions, please?

Upvotes: 0

Views: 940

Answers (2)

user1526892

Reputation: 67

Thanks for the reply. The requirement has since changed. Our flow is now Lookup1 -> Copy Data -> Copy Data -> Lookup2. We pass the values from Lookup1 and run a stored procedure.
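A minimal sketch of such a stored procedure, assuming the same T1 table and source_json VARIANT column as in the question (the procedure name and parameters are hypothetical), could look like this. Wrapping the UPDATE in a procedure means the Lookup activity only issues a single CALL statement, which avoids the one-statement-per-API-call restriction:

```sql
-- Hypothetical procedure wrapping the single-statement update.
-- Parameters let the pipeline pass in values from Lookup1.
CREATE OR REPLACE PROCEDURE add_json_key(key_name VARCHAR, key_value VARCHAR)
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
BEGIN
    -- :key_name / :key_value are Snowflake Scripting bind variables
    UPDATE T1
    SET source_json = OBJECT_INSERT(source_json, :key_name, :key_value, TRUE);
    RETURN 'done';
END;
$$;

-- Single statement issued from the Lookup activity:
CALL add_json_key('device_type', 'web_browser');
```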

Upvotes: 0

Abhishek Khandave

Reputation: 3230

Some workarounds are provided below.

Execute multiple SQL files using SnowSQL (the command-line utility), as described below:

snowsql -c cc -f file1.sql -f file2.sql -f file3.sql

Once we have downloaded and installed the snowsql tool, we can wrap all our SQL queries in a single .sql file and run that file from bash.

For example, suppose we have written all the queries we would like to run in a file named abc.sql stored in /tmp.

We can then run the following command:

snowsql -a enter_accountname -u enter_your_username -f /tmp/abc.sql
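For illustration, /tmp/abc.sql could simply list the statements one after another. snowsql sends each statement in its own call, so the single-API-call limit does not apply; the table and column names below are taken from the question, and the final SELECT is just an optional sanity check:

```sql
-- /tmp/abc.sql: snowsql executes each statement separately
UPDATE T1 SET source_json = OBJECT_INSERT(source_json, 'device_type', 'web_browser', TRUE);

-- optional: confirm the row count afterwards
SELECT COUNT(*) FROM T1;
```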

For reference:

Workaround for multiple sql statement in a single api call are not supported

Multiple single api call are not supported use one api call per statement instead

Upvotes: 0
