Harish Nandoliya

Reputation: 101

How to optimise a Google BigQuery query that joins 17+ tables containing approx. 55 GB of data?

I have a huge data store that contains 20+ tables. All of the tables hold gigabytes of data.

Basically, I'm exporting all the data into CSV for analysis. My join query covers 17+ tables and processes billions of records; Google estimates it will process about 10 GB of data.

The problem is that the query takes too much time and too many resources, and it sometimes fails with a resource-limit error. How can I optimize such a query?

FYI: I'm using LEFT JOINs.

Upvotes: 0

Views: 45

Answers (1)

Vibhor Gupta

Reputation: 699

The best way to optimize your query is to implement partitioning and clustering, ideally on the fields your join conditions use. That way BigQuery can prune partitions and colocate related rows before the joins run.
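For example, you could recreate your largest tables partitioned on a date column and clustered on the join key. A minimal sketch, where `mydataset.orders`, `order_date`, and `customer_id` are hypothetical placeholders for your own dataset, table, and join columns:

```sql
-- Rebuild the table partitioned by date and clustered on the join key.
-- Replace the placeholder dataset/table/column names with your own.
CREATE TABLE mydataset.orders_optimized
PARTITION BY DATE(order_date)
CLUSTER BY customer_id
AS
SELECT *
FROM mydataset.orders;
```

Once the tables are partitioned, add a filter on the partition column (e.g. `WHERE order_date >= '2020-01-01'`) in your big join query so BigQuery scans only the matching partitions; clustering on the join key then reduces how much data each LEFT JOIN has to shuffle.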

Upvotes: 1
