user2611300

Reputation: 133

BigQuery: Does a huge number of tables in a dataset impact performance?

I am currently using BigQuery to store user information and compute aggregate results against huge log data. But since modifying the data is not possible, I am planning to store each user's record in a separate table to work around this. I understand BigQuery supports querying across multiple tables, which would let me get all the information. My doubts here are:


Thanks in advance

Upvotes: 1

Views: 171

Answers (1)

Mikhail Berlyant

Reputation: 172944

From what I know, there is no hard limit on the number of tables in a dataset.
At the same time, the native BQ UI shows only the first 10,000 tables in a dataset.

Other limits to consider (just a few to mention):
* Daily update limit: 1,000 updates per table per day;
* A query (including referenced views) can reference at most 1,000 tables;
* Each additional table involved in a query (with hundreds and hundreds of tables) has a considerable impact on performance;
* Even if each table is small, it will still be billed at a minimum of 10 MB per table (even if it holds just a few KB).
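To see why that last point matters for a table-per-user design, here is a minimal back-of-the-envelope sketch in Python. The 10 MB per-table minimum comes from the answer above; the $5-per-TB on-demand rate is an assumption for illustration and may differ from current pricing.

```python
# Estimate billed bytes when a query touches many tiny tables,
# each rounded up to BigQuery's 10 MB per-table billing minimum.

MIN_BILLED_BYTES_PER_TABLE = 10 * 1024 * 1024  # 10 MB minimum per table
PRICE_PER_TB_USD = 5.00                        # assumed on-demand rate
TB = 1024 ** 4

def billed_bytes(table_sizes):
    """Sum billed bytes, rounding each table up to the 10 MB minimum."""
    return sum(max(size, MIN_BILLED_BYTES_PER_TABLE) for size in table_sizes)

# 1,000 per-user tables of 50 KB each: under 50 MB of actual data...
tables = [50 * 1024] * 1000
actual = sum(tables)
billed = billed_bytes(tables)

print(actual // (1024 * 1024), "MB actual")   # ~48 MB of real data
print(billed // (1024 * 1024), "MB billed")   # 10,000 MB billed
print(round(billed / TB * PRICE_PER_TB_USD, 4), "USD per full scan")
```

So scanning 1,000 tiny tables is billed as roughly 10 GB even though the data itself is under 50 MB, a 200x overcharge in this scenario.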

Not knowing your exact scenario, I can't make a specific recommendation, but at least you have an answer on those items in your question.

Overall, the idea of having a table per user doesn't sound good to me.
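The usual alternative is a single table with a `user_id` column, which sidesteps all of the per-table limits above. A minimal sketch of that shape, using plain Python dicts as stand-ins for BigQuery rows (all column names here are illustrative, not from the question):

```python
# Sketch: one logical "user_events" table where user_id is just a
# column, instead of a separate table per user. Aggregating per user
# is then a single grouped query over one table.
from collections import defaultdict

user_events = [
    {"user_id": "u1", "bytes": 100},
    {"user_id": "u2", "bytes": 250},
    {"user_id": "u1", "bytes": 50},
]

def aggregate_per_user(rows):
    """Rough equivalent of SELECT user_id, SUM(bytes) ... GROUP BY user_id."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["user_id"]] += row["bytes"]
    return dict(totals)

print(aggregate_per_user(user_events))  # {'u1': 150, 'u2': 250}
```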

Upvotes: 4
