Reputation: 400
I have a table in BigQuery that is updated daily with new rows. I have created a new partitioned table using PARTITION BY on a date column to reduce execution time and cost. But I also need the partitioned table to be updated automatically every day with the new data. How should this be implemented? I am a newbie in BigQuery, so I need help.
Upvotes: 0
Views: 339
Reputation: 1828
You can use the code below to load data into a column-based time-partitioned table.
from google.cloud import bigquery

# Construct a BigQuery client object.
client = bigquery.Client()

# TODO(developer): Set table_id to the ID of the table to create.
# table_id = "your-project.your_dataset.your_table_name"

job_config = bigquery.LoadJobConfig(
    schema=[
        bigquery.SchemaField("name", "STRING"),
        bigquery.SchemaField("post_abbr", "STRING"),
        bigquery.SchemaField("date", "DATE"),
    ],
    skip_leading_rows=1,
    time_partitioning=bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY,
        field="date",  # Name of the column to use for partitioning.
        expiration_ms=7776000000,  # 90 days.
    ),
)

uri = "gs://cloud-samples-data/bigquery/us-states/us-states-by-date.csv"

load_job = client.load_table_from_uri(
    uri, table_id, job_config=job_config
)  # Make an API request.

load_job.result()  # Wait for the job to complete.

table = client.get_table(table_id)
print("Loaded {} rows to table {}".format(table.num_rows, table_id))
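Since your source receives new rows every day, you can re-run a similar load job daily with the write disposition set to WRITE_APPEND, so each run adds rows to the existing partitioned table instead of replacing it. Below is a minimal sketch assuming a table ID and a daily file URI of your own; `partition_decorator` is a hypothetical helper showing the "table$YYYYMMDD" form BigQuery accepts for addressing a single daily partition:

```python
from datetime import date


def partition_decorator(table_id: str, day: date) -> str:
    # Hypothetical helper: BigQuery accepts "table$YYYYMMDD" to
    # address one daily partition of a partitioned table directly.
    return f"{table_id}${day:%Y%m%d}"


def append_daily_rows(table_id: str, uri: str) -> None:
    # Requires the google-cloud-bigquery package and GCP credentials;
    # imported inside the function so the module loads without them.
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        skip_leading_rows=1,
        # Append the new rows; WRITE_TRUNCATE would overwrite instead.
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    client.load_table_from_uri(uri, table_id, job_config=job_config).result()
```

To make this run automatically each day, you can trigger it from a cron job, or from Cloud Scheduler invoking a Cloud Function.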
For more information about partitioned tables, you can refer to this document.
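If the daily extract can contain rows that are already in the target table, a MERGE from a staging table avoids inserting duplicates. This is only a sketch assuming the three-column schema shown above, with `(date, post_abbr)` as the deduplication key; the target and staging table names are placeholders you would replace with your own:

```python
def build_merge_sql(target: str, staging: str) -> str:
    # Insert only the staging rows whose (date, post_abbr) pair is
    # not already present in the target partitioned table.
    return f"""
    MERGE `{target}` T
    USING `{staging}` S
    ON T.date = S.date AND T.post_abbr = S.post_abbr
    WHEN NOT MATCHED THEN
      INSERT (name, post_abbr, date) VALUES (S.name, S.post_abbr, S.date)
    """
```

You would run this with `client.query(build_merge_sql(...)).result()`, or paste the generated statement into a BigQuery scheduled query so it runs daily without any external scheduler.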
Upvotes: 1