Manuel Valero

Reputation: 445

BigQuery: Unexpected behaviour using a BigQuery job when writing query results

I'm currently using BigQuery jobs to adapt some data and load it into another table.

My BigQuery job reads from a table using a query job and then writes the result into another table. The job executes successfully and the job state is DONE, but no rows are loaded.

This is the code:

from google.cloud import bigquery

bigquery_client = bigquery.Client()

table_id_from = "table_from"

table_ref_to = bigquery_client.dataset('format').table("table_to")

job_config = bigquery.LoadJobConfig()

job_config.create_disposition = 'NEVER'
job_config.destination = table_ref_to
job_config.write_disposition = 'WRITE_APPEND'
job_config.use_legacy_sql = False

# Start the query, passing in the extra configuration.
query = """SELECT id, name, short_name,
    subdomain, address, address2, department, state, zip
    from staging.%s;""" %(table_id_from)

query_job = bigquery_client.query(query, job_config=job_config)

rows_from_staging = list(query_job)  # Waits for the query to finish
print(len(rows_from_staging))
# assert query_job.state == 'RUNNING'
# assert query_job.job_type == 'query'

iterator = bigquery_client.list_rows(
    table_ref_to, selected_fields=[bigquery.SchemaField('id', 'INTEGER')])
rows = list(iterator)
print(len(rows))
print(query_job.state)
query_job.result()

The first part, reading from the source table, prints a length of 3. On the other hand, querying the destination table reads nothing and prints 0 as the length of the rows.

3
0
DONE

What is happening? I expected it to throw an error if something was wrong, but it runs successfully. Any help?

Upvotes: 1

Views: 191

Answers (1)

SaadK

Reputation: 256

You are using LoadJobConfig() instead of QueryJobConfig(). If you change it, this will work fine.
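For reference, a minimal sketch of the corrected code, assuming the same source and destination tables as in the question (staging.table_from and format.table_to):

from google.cloud import bigquery

bigquery_client = bigquery.Client()

table_ref_to = bigquery_client.dataset('format').table("table_to")

# QueryJobConfig is the configuration type that client.query() expects;
# destination and write_disposition direct the query result into table_to.
job_config = bigquery.QueryJobConfig()
job_config.destination = table_ref_to
job_config.write_disposition = 'WRITE_APPEND'
job_config.use_legacy_sql = False

query = """SELECT id, name, short_name,
    subdomain, address, address2, department, state, zip
    FROM staging.table_from"""

query_job = bigquery_client.query(query, job_config=job_config)
query_job.result()  # Wait for the query to finish and the rows to be written

After result() returns, listing the rows of the destination table as in the question should show the appended rows.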

Upvotes: 2
