Marc Frame

Reputation: 1001

Delete all rows in big query table

I have a table that I use for unit tests: I run some code, query the table to check that the data was successfully added, and then delete the full contents of the table.

How would I go about deleting every single row in the table?

I would rather not have to recreate the table.

Efficiency isn't a factor, as I'm dealing with fewer than 10 rows in each unit test.

Upvotes: 3

Views: 5136

Answers (3)

Mapad

Reputation: 8537

You can set write_disposition to WRITE_TRUNCATE to erase the table's contents before inserting; see the documentation. To use this, you'll have to insert your data with a load job. I could not find a way to make this work with streaming inserts.

Here is an example in Python:

from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig()
# WRITE_TRUNCATE replaces the table's existing contents with the loaded data.
job_config.write_disposition = bigquery.WriteDisposition.WRITE_TRUNCATE
dataset_ref = bigquery.DatasetReference(project, "my_dataset")
table_ref = dataset_ref.table(table_id)

with open(filepath, "rb") as source_file:
    job = client.load_table_from_file(
        source_file,
        table_ref,
        job_config=job_config,
    )  # API request
    job.result()  # Waits for the table load to complete.

Upvotes: 0

Felipe Hoffa

Reputation: 59175

This alternative doesn't incur the 10 MB minimum that BigQuery charges per DML statement:

CREATE OR REPLACE TABLE `temp.replaceable`
AS 
SELECT *
FROM `temp.replaceable`
LIMIT 0
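This statement can be issued from the Python client as well. A minimal sketch, assuming the `temp.replaceable` table name from the answer; the API call itself is shown commented out because it requires the google-cloud-bigquery package and valid credentials:

```python
# Sketch only: builds the CREATE OR REPLACE statement from the answer.
table = "temp.replaceable"  # table name from the answer above
truncate_sql = (
    f"CREATE OR REPLACE TABLE `{table}` AS "
    f"SELECT * FROM `{table}` LIMIT 0"
)
# Running it requires google-cloud-bigquery and credentials:
# from google.cloud import bigquery
# bigquery.Client().query(truncate_sql).result()
print(truncate_sql)
```

Because the table is recreated from its own schema with `LIMIT 0`, the schema is preserved and no rows are scanned for billing.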

Upvotes: 4

Marc Frame

Reputation: 1001

DELETE FROM `projectid.datasetid.tableid` WHERE TRUE
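A hedged sketch of running this DML statement through the Python client, keeping the placeholder project/dataset/table IDs from the statement; the client call is commented out since it needs the google-cloud-bigquery package and credentials:

```python
# Sketch: the DELETE statement from the answer as a Python string.
delete_sql = "DELETE FROM `projectid.datasetid.tableid` WHERE TRUE"
# Running it requires google-cloud-bigquery and credentials:
# from google.cloud import bigquery
# bigquery.Client().query(delete_sql).result()
print(delete_sql)
```

`WHERE TRUE` satisfies BigQuery's requirement that DELETE statements include a WHERE clause, while still matching every row.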

Upvotes: 5
