Reputation: 1001
I have a table that I use for unit tests: I run some code, query the table to check that the data was successfully added, and then delete the full contents of the table.
How would I go about deleting every single row in the table?
I would rather not have to recreate the table.
Efficiency isn't a factor, as I'm only dealing with fewer than 10 rows in each unit test.
Upvotes: 3
Views: 5136
Reputation: 8537
You can set write_disposition to WRITE_TRUNCATE in order to erase the table before inserting. See the documentation. To use this, you'll have to insert your data with a load job; I could not find a way to make it work with streaming inserts.
Here is an example in Python:
from google.cloud import bigquery

client = bigquery.Client()

# WRITE_TRUNCATE replaces the table contents with the loaded data.
job_config = bigquery.LoadJobConfig()
job_config.write_disposition = bigquery.WriteDisposition.WRITE_TRUNCATE

dataset_ref = bigquery.DatasetReference(project, "my_dataset")
table_ref = dataset_ref.table(table_id)

with open(filepath, "rb") as source_file:
    job = client.load_table_from_file(
        source_file,
        table_ref,
        job_config=job_config,
    )  # API request

job.result()  # Waits for the table load to complete.
Upvotes: 0
Reputation: 59175
This alternative doesn't incur the 10 MB minimum bytes-billed charge, since LIMIT 0 scans no data:
CREATE OR REPLACE TABLE `temp.replaceable`
AS
SELECT *
FROM `temp.replaceable`
LIMIT 0
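For a unit-test teardown, the statement above can be built and issued from Python. This is a minimal sketch, not a definitive implementation: `truncate_sql` is a hypothetical helper, and running the statement assumes an authenticated `google.cloud.bigquery` client.

```python
def truncate_sql(table: str) -> str:
    # Build the zero-scan truncate statement shown above for the given
    # fully qualified table name (e.g. "temp.replaceable").
    return (
        f"CREATE OR REPLACE TABLE `{table}` AS\n"
        f"SELECT * FROM `{table}` LIMIT 0"
    )

# With an authenticated client (assumption: google-cloud-bigquery is
# installed and credentials are configured), a teardown could run:
#   client.query(truncate_sql("temp.replaceable")).result()
print(truncate_sql("temp.replaceable"))
```

Because the statement is just a string, the helper itself can be unit-tested without touching BigQuery.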
Upvotes: 4