Reputation: 2406
My Google Cloud SQL table currently has 1,126,571 rows and grows by at least 30 thousand rows every day. When I execute the query:
select count(distinct sno) as tot from visits
from the SQL prompt, it generates the following error:
Error 0: Unable to execute statement
Is a Cloud SQL query subject to a 60-second execution limit? How can I overcome this problem as the table becomes larger?
Upvotes: 7
Views: 882
Reputation: 423
Try splitting your select query into many parts; for example, the first query is limited to 50000 rows, the second query starts at row 50000 and is limited to the next 50000, and so on.
You can do that with this approach:
1- Get the record count.
2- Make a loop that runs until the record count is reached.
3- In each iteration, select 50000 records and append the results to a data table (the details depend on your programming language; see the sketch at the end of this answer).
4- In the next iteration, start selecting from where the previous one ended; for example, the second query selects the next 50000 records, and so on.
You can specify the starting index of your select with this SQL statement:
SELECT * FROM mytable LIMIT 50000 OFFSET 0;
This way you will get all the data you want.
NOTE: run a test to find the maximum number of records that can be loaded within 60 seconds; a larger chunk size means fewer loop iterations and therefore better performance.
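A minimal sketch of that loop in Python, assuming MySQL Connector/Python; the connection details, the chunk size of 50000, and ordering by sno are assumptions you would adapt to your own setup:

import mysql.connector

# Placeholder credentials; replace with your own Cloud SQL connection details.
conn = mysql.connector.connect(host="127.0.0.1", user="app",
                               password="secret", database="mydb")
cursor = conn.cursor()

CHUNK = 50000            # tune this to what finishes comfortably under 60 seconds
offset = 0
distinct_sno = set()     # accumulates distinct sno values across all chunks

while True:
    # ORDER BY keeps the chunks deterministic; without it, LIMIT/OFFSET pages
    # are not guaranteed to be stable from one query to the next.
    cursor.execute(
        "SELECT sno FROM visits ORDER BY sno LIMIT %s OFFSET %s",
        (CHUNK, offset),
    )
    rows = cursor.fetchall()
    if not rows:
        break
    distinct_sno.update(row[0] for row in rows)
    offset += CHUNK

print("tot =", len(distinct_sno))
cursor.close()
conn.close()

Each individual query stays small enough to finish within the limit, and the distinct count is assembled on the client side.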
Upvotes: 0
Reputation: 5287
Break the table into two tables: one to receive new visits (the transaction table) and one for reporting. Index the reporting table. Transfer and clear data on a regular basis.
The transaction table will remain relatively small and thus it will be fast to count. The reporting table will be fast to count because of the index.
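A rough sketch of that transfer job in Python, again using MySQL Connector/Python; the table names visits_txn and visits_reporting, the id column used as a cutoff, and the credentials are all assumptions to adapt to your schema:

import mysql.connector

# Placeholder credentials and hypothetical table/column names.
conn = mysql.connector.connect(host="127.0.0.1", user="app",
                               password="secret", database="mydb")
cursor = conn.cursor()

# Pick a cutoff first so rows inserted while the job runs are not lost
# (assumes an auto-increment primary key `id` on the transaction table).
cursor.execute("SELECT MAX(id) FROM visits_txn")
cutoff = cursor.fetchone()[0]

if cutoff is not None:
    # Copy everything up to the cutoff into the indexed reporting table...
    cursor.execute(
        "INSERT INTO visits_reporting SELECT * FROM visits_txn WHERE id <= %s",
        (cutoff,),
    )
    # ...then clear only the rows that were copied.
    cursor.execute("DELETE FROM visits_txn WHERE id <= %s", (cutoff,))
    conn.commit()

cursor.close()
conn.close()

Run something like this on a schedule (for example a daily cron job) so the transaction table stays small.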
Upvotes: 1
Reputation: 263733
Add an INDEX on your column sno and it will improve the query's performance:
ALTER TABLE visits ADD INDEX (sno)
Upvotes: 0