Radka Žmers
Radka Žmers

Reputation: 45

Airflow - copy BQ table to another BQ table

I need to copy a BQ table from one BQ dataset to another BQ dataset within the same project.

I found there is an operator BigQueryToBigQuery -airflow.providers.google.cloud.transfers.bigquery_to_bigquery, but I am not able to make it work on my side.

Can you help what to change?

from airflow import DAG
from astro_plugins.sensors.bigquery_sql_sensor import BigquerySqlSensor
from airflow.providers.google.cloud.transfers.bigquery_to_bigquery import BigQueryToBigQueryOperator
from dags.config import var

import datetime as dt

dag = DAG(
    # DAG = collection of tasks
    dag_id="dag_copy_bq_to_bq",
    default_args={
        "owner": "<owner>",
        "email": ["[email protected]"],
        "email_on_failure": True,
        "email_on_retry": False,
        "depends_on_past": False,
        "start_date": dt.datetime(2023, 9, 19, 7, 0, 0),
        "retries": 0,
    },
    schedule_interval="@hourly",
)

copy_table_task = BigQueryToBigQueryOperator(
    task_id='copy_table_task',
    source_project_dataset_tables='<project_id>.<source_dataset>',  # Source table
    destination_project_dataset_table='<project_id>.<destination_dataset>',  # Destination table
    write_disposition='WRITE_APPEND',  # Options: WRITE_TRUNCATE, WRITE_APPEND, WRITE_EMPTY
    create_disposition='CREATE_IF_NEEDED',
    schema_update_options=['ALLOW_FIELD_ADDITION', 'ALLOW_FIELD_RELAXATION'],
    gcp_conn_id='google_cloud_default',
    bigquery_conn_id='<project_id>.<dataset>',  # Airflow connection id to BigQuery
    #use_legacy_sql=False,
    #labels=None,
   # encryption_configuration=None,
   # location=None,
    impersonation_chain=var("astro_team_sa")
)

Upvotes: 1

Views: 287

Answers (0)

Related Questions