Reputation: 689
I have a table with many (hundreds, later thousands) of UPDATE queries that I would like to execute from Datalab.
The code is the following. Reading the table with these commands:
%%sql --module std_sql_table
WITH q1 as (
SELECT * FROM `speedy-emissary-167213.pgp_orielresearch.update_queries`
)
select * from q1
import datalab.bigquery as bq
# uses pandas for the DataFrame
query_table_df = bq.Query(std_sql_table).to_dataframe(dialect='standard', use_cache=True)
print(query_table_df.head(10))
col_name = list(query_table_df)  # the list of column names
print(col_name)
# THIS LOOP IS FOR THE UPDATE COMMAND ROWS THAT I WANT TO EXECUTE
# for index, row in query_table_df.iterrows():
#     print("running " + row[col_name[0]])
#     row_query = row[col_name[0]]
#     query_result_row_df = bq.Query(row_query).to_dataframe(dialect='standard', use_cache=True)
The output is the following; I would like to execute every row in the table:
0    UPDATE speedy-emissary-167213.pgp_orielresear...
1    UPDATE speedy-emissary-167213.pgp_orielresear...
2    UPDATE speedy-emissary-167213.pgp_orielresear...
3    UPDATE speedy-emissary-167213.pgp_orielresear...
4    UPDATE speedy-emissary-167213.pgp_orielresear...
5    UPDATE speedy-emissary-167213.pgp_orielresear...
6    UPDATE speedy-emissary-167213.pgp_orielresear...
7    UPDATE speedy-emissary-167213.pgp_orielresear...
8    UPDATE speedy-emissary-167213.pgp_orielresear...
9    UPDATE speedy-emissary-167213.pgp_orielresear...
[u'f0_']
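For reference, a minimal sketch of what the commented-out loop is trying to do, using a plain pandas DataFrame as a stand-in for the query result. The column name `f0_` comes from the output above; the UPDATE strings here are hypothetical, and the actual execution call is left as a comment because it needs a live BigQuery project:

```python
import pandas as pd

# Stand-in for query_table_df; in the question the single column is named 'f0_'.
query_table_df = pd.DataFrame({
    'f0_': [
        "UPDATE t SET a = 1 WHERE id = 1",  # hypothetical statements
        "UPDATE t SET a = 2 WHERE id = 2",
    ]
})

col_name = list(query_table_df)[0]  # 'f0_' -- the first (and only) column name

statements = []
for index, row in query_table_df.iterrows():
    row_query = row[col_name]       # the UPDATE text for this row
    statements.append(row_query)
    print("running " + row_query)
    # With datalab this would be executed as, e.g.:
    # bq.Query(row_query).to_dataframe(dialect='standard', use_cache=True)
```

Note that `list(query_table_df)` returns a list of column names, so indexing a row with the whole list (`row[col_name]` when `col_name` is a list) returns a Series rather than the statement string.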
Any idea is very welcome!
Upvotes: 2
Views: 279
Reputation: 195
Please read the following doc: https://cloud.google.com/bigquery/docs/reference/standard-sql/data-manipulation-language
Essentially you need to merge the UPDATE statements; otherwise you will hit quota issues, pay much more than needed, and get worse performance. BigQuery is good for analytics, but it should not be treated as a general-purpose database.
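To illustrate the merging idea: many single-row statements like `UPDATE t SET score = 1 WHERE id = 'a'` can be collapsed into one statement with a CASE expression, so you pay for one DML job instead of thousands. The table and column names below are hypothetical, and only the string-building is shown; a real implementation should use query parameters rather than string formatting:

```python
def merge_updates(table, column, key, pairs):
    """Build one UPDATE statement covering many (key_value, new_value) pairs.

    All values are assumed to be simple literals; escaping/parameterization
    is deliberately omitted to keep the sketch short.
    """
    cases = "\n    ".join(
        "WHEN '{}' THEN {}".format(k, v) for k, v in pairs
    )
    keys = ", ".join("'{}'".format(k) for k, _ in pairs)
    return (
        "UPDATE `{t}`\n"
        "SET {c} = CASE {k}\n"
        "    {cases}\n"
        "END\n"
        "WHERE {k} IN ({keys})"
    ).format(t=table, c=column, k=key, cases=cases, keys=keys)

sql = merge_updates('project.dataset.table', 'score', 'id',
                    [('a', 1), ('b', 2), ('c', 3)])
print(sql)
```

This turns three per-row updates into a single statement, which stays well within BigQuery's DML quotas even when the pair list grows into the thousands.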
Upvotes: 1