alacarter

Reputation: 359

BigQuery updates failing, but only when batched using Python API

I am trying to update a table using batched UPDATE statements. The DML queries execute successfully in the BigQuery web UI, but when batched through the Python API, the first one succeeds while the others fail. Why is this?

A sample query:

query = '''
update `project.dataset.Table`
set my_fk = 1234
where other_fk = 222 and
  received >= PARSE_TIMESTAMP("%Y-%m-%d %H:%M:%S", "2018-01-22 05:28:12") and 
  received <= PARSE_TIMESTAMP("%Y-%m-%d %H:%M:%S", "2018-01-26 02:31:51")
'''

Sample code:

from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.QueryJobConfig()
job_config.priority = bigquery.QueryPriority.BATCH

queries = []  # list of DML statement strings like the one above
jobs = []
for query in queries:
    job = client.query(query, location='US', job_config=job_config)
    jobs.append(job)

Job output:

for job in jobs[1:]:
    print(job.state)
    # Done

    print(job.error_result)
    # {'message': 'Cannot set destination table in jobs with DML statements',
    # 'reason': 'invalidQuery'}

    print(job.use_legacy_sql)
    # False

    print(job.job_type)
    # Query

Upvotes: 1

Views: 2345

Answers (2)

Guillem Xercavins

Reputation: 7058

I suspect the problem is that job_config gets some fields populated (destination in particular) by the BigQuery API after the first job is inserted. The second job then fails because it is a DML statement with a destination table set in its job configuration. You can verify that with:

for query in queries:
    print(job_config.destination)  # None before the first job is inserted
    job = client.query(query, location='US', job_config=job_config)
    print(job_config.destination)  # now populated by the API response
    jobs.append(job)

To solve this, avoid reusing the same job_config for all jobs and create a fresh one per query:

for query in queries:
    # A fresh config per job, so the previous job's destination isn't carried over.
    job_config = bigquery.QueryJobConfig()
    job_config.priority = bigquery.QueryPriority.BATCH
    job = client.query(query, location='US', job_config=job_config)
    jobs.append(job)
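
An equivalent approach, if you prefer to keep a single base configuration, is to hand the client a throwaway copy each time. A minimal sketch using copy.deepcopy (reusing client and queries from above, and assuming your client version is one that mutates the passed-in config):

import copy

base_config = bigquery.QueryJobConfig()
base_config.priority = bigquery.QueryPriority.BATCH

jobs = []
for query in queries:
    # Pass a copy so the API response cannot write a destination
    # back into the shared base configuration.
    job = client.query(query, location='US', job_config=copy.deepcopy(base_config))
    jobs.append(job)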

Upvotes: 2

Tamir Klein

Reputation: 3632

Your code seems to work fine for a single update. This is what I tried using Python 3.6.5 and v1.9.0 of the client library:

from google.cloud import bigquery
client = bigquery.Client()

query = '''
UPDATE `project.dataset.table` SET msg = null WHERE x is null
'''

job_config = bigquery.QueryJobConfig()
job_config.priority = bigquery.QueryPriority.BATCH
job = client.query(query, location='US', job_config=job_config)

print(job.state)
# PENDING

print(job.error_result)
#  None

print(job.use_legacy_sql)
# False

print(job.job_type)
# Query
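
The PENDING state is expected right after submission, since batch queries are queued until resources are available. If you want to block until the job completes before checking its outcome, a minimal sketch:

job.result()  # blocks until the job finishes; raises if the query failed

print(job.state)
# DONE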

Please check your configuration, and if this doesn't help you solve your problem, provide the full code with an error log.

BTW, I also verified this from the command line:

sh-3.2# ./bq query --nouse_legacy_sql --batch=true 'UPDATE `project.dataset.table` SET msg = null WHERE x is null'
Waiting on bqjob_r5ee4f5dd56dc212f_000001697d3f9a56_1 ... (133s) Current status: RUNNING
Waiting on bqjob_r5ee4f5dd56dc212f_000001697d3f9a56_1 ... (139s) Current status: DONE
sh-3.2#
sh-3.2# python --version

Upvotes: 1
