Reputation: 133
Is it possible to use BigQuery scripting in the Airflow BigQueryOperator (Airflow 1.10.12)? Has anyone managed to do it? I tried something like this:
test = BigQueryOperator(
    task_id='test',
    sql="""DECLARE aaa STRING;
    SET aaa = 'data';
    CREATE OR REPLACE TABLE `project-id.dataset-id.TEST_DDL` AS SELECT aaa AS TEST;""",
    use_legacy_sql=False,
    create_disposition=False,
    write_disposition=False,
    schema_update_options=False,
    location='EU')
But all I get is the error 'Not found: Dataset was not found in location US at [3:9]'.
Upvotes: 0
Views: 825
Reputation: 133
Actually I found the issue, and it IS related to the BigQueryOperator. When scripting, there are no referenced tables and no destination table in the BigQuery insert job, so BigQuery sets the job location to US by default. In my case the datasets are in EU, so the job fails. The BigQueryOperator does have a location parameter, but the operator wrongly passes it in the job's configuration object instead of in the job reference object, which makes it useless. The issue is fixed in Airflow 2.
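To make the placement issue concrete, here is a minimal sketch of the difference. The field names follow the BigQuery `jobs.insert` REST API body; the `build_job_payload` helper is hypothetical and only illustrates where `location` has to live for a scripting job with no referenced or destination tables:

```python
# Hypothetical helper showing where `location` must live in a BigQuery
# jobs.insert request body. Field names follow the BigQuery REST API;
# this is a sketch, not the operator's actual code.
def build_job_payload(query: str, location: str) -> dict:
    """Build a jobs.insert body with the location in jobReference."""
    return {
        # Correct placement: BigQuery reads the job location from here.
        "jobReference": {"location": location},
        "configuration": {
            # Airflow 1.10.12 put "location" inside this configuration
            # object instead, where the API simply ignores it, so the
            # job fell back to the US default.
            "query": {"query": query, "useLegacySql": False},
        },
    }

payload = build_job_payload("DECLARE aaa STRING; SET aaa = 'data';", "EU")
```

With `location` in `jobReference`, the scripting job runs in EU even though BigQuery cannot infer a region from any table reference.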
Upvotes: 1