Reputation: 9
I'm testing out Google BigQuery as a replacement for the traditional databases we have used in the past (PostgreSQL / MySQL), but I have found the performance to be extremely (and oddly) slow.
I uploaded a ~6 MB dataset (~44,000 rows) to test.
I tried to run a few simple queries:
SELECT Sub_Category, COUNT(*) AS COUNT
FROM `rnd-projects-247203.test.data`
GROUP BY Sub_Category
Can someone help explain why such a simple query on such a small dataset takes over 20 seconds to run? Have I done something wrong in the setup, or do I need to do something differently?
Thanks!
Edit: The data is sourced and loaded from a .csv file on Google Drive. (Execution details screenshot, expanded under stage S00, omitted.)
Upvotes: 0
Views: 910
Reputation: 207912
BigQuery is a petabyte-scale data warehouse. It works best as a complement to a traditional database; it should not be considered a replacement for MySQL/Postgres.
BigQuery is really fast for large-scale queries, on the order of 3-10 seconds at terabyte/petabyte scale. Small queries also take around 1-2 seconds, because every query carries a fixed startup overhead. Either way, it is not the 20 milliseconds you would see on a transactional database.
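One likely contributor here, given the question's note that the data is loaded from a Google Drive .csv file, is that a Drive-backed external table is re-read through the federation layer on every query. Materializing it into native BigQuery storage usually helps; a minimal sketch, assuming `test.data` is the Drive-backed table and using a hypothetical target table name `test.data_native`:

```sql
-- Copy the Drive-backed external table into native BigQuery storage.
-- (Table names are illustrative; adjust to your project/dataset.)
CREATE OR REPLACE TABLE `rnd-projects-247203.test.data_native` AS
SELECT *
FROM `rnd-projects-247203.test.data`;

-- Subsequent queries scan native storage instead of re-fetching the
-- CSV from Google Drive on each run:
SELECT Sub_Category, COUNT(*) AS cnt
FROM `rnd-projects-247203.test.data_native`
GROUP BY Sub_Category;
```

Even against native storage there is still per-query overhead of a second or two, but the variable cost of reading from Drive goes away.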
Upvotes: 1