BAR

Reputation: 17151

BigQuery Row Limits

Google says BigQuery can handle billions of rows.

For my application I estimate a usage of 200,000,000 * 1,000 rows, i.e. around 200 billion rows. Well over a few billion.

I can partition the data into 200,000,000 rows per partition, but the only support for partitioning in BigQuery seems to be separate tables. (Please correct me if I am wrong.)

The total data size will be around 2TB.

In the examples I saw some large data sizes, but the row counts were all under a billion.

Can BigQuery support the number of rows I am dealing with in a single table?

If not, can I partition it in any way besides multiple tables?

Upvotes: 4

Views: 22282

Answers (2)

Jeremy Condit

Reputation: 7046

Short answer: Yes, BigQuery will handle this just fine, even if you put all the data in a single table.

If you do want to partition your data, the only way to do it right now is to explicitly store your data in multiple tables. You might consider doing so to reduce your bill if you frequently query only a subset of your data. Many users partition their data by date and use table wildcard functions to write queries across a subset of those partitioned tables.
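For illustration, here is a minimal sketch of that pattern using legacy SQL's TABLE_DATE_RANGE function; the mydataset.events_ table prefix and the date range are hypothetical:

SELECT COUNT(*) AS events
FROM TABLE_DATE_RANGE(mydataset.events_,         -- daily tables named events_YYYYMMDD
                      TIMESTAMP('2016-01-01'),
                      TIMESTAMP('2016-01-07'))   -- scans only the 7 tables in range

Because only the tables matching the date range are scanned, you pay for one week of data instead of the whole dataset.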

Upvotes: 4

Mikhail Berlyant

Reputation: 173171

Below should answer your question.

I ran it against one of our datasets.
As you can see, table sizes are close to 10TB with around 1.3-1.6 billion rows each.

SELECT 
  ROUND(size_bytes/1024/1024/1024/1024) AS TB,  -- convert bytes to terabytes
  row_count AS ROWS
FROM [mydataset.__TABLES__]   -- legacy SQL meta-table with per-table stats
ORDER BY row_count DESC       -- largest tables by row count first
LIMIT 10

I think the largest table we have dealt with so far was around 5-6 billion rows, and everything worked as expected.

Row   TB        ROWS     
1   10.0    1582903965   
2   11.0    1552433513   
3   10.0    1526783717   
4    9.0    1415777124   
5   10.0    1412000551   
6   10.0    1410253780   
7   11.0    1398147645   
8   11.0    1382021285   
9   11.0    1378284566   
10  11.0    1369109770   

Upvotes: 9
