Reputation: 1173
I have a DB with more than 100,000 records. When I select 50 records (using LIMIT), it takes more than 10 seconds, sometimes 18 seconds on old devices. Any ideas for this problem? Some magical primary key trick, or ...?
I think I may have to split my data across several tables or databases, because my code is fine and retrieving this amount of data from the SQLite engine is just slow.
My table structure is:
createdb="CREATE TABLE `sms` ("
+ " `_id` INTEGER PRIMARY KEY AUTOINCREMENT, "
+ "`categorys` INTEGER NOT NULL, "
+ " `text` text NOT NULL, "
+ " `rate` INTEGER NOT NULL,"
+ " `sended` INTEGER NOT NULL,"
+ " `deleted` INTEGER NOT NULL,"
+ " `liked` INTEGER NOT NULL,"
+ " `readed` INTEGER NOT NULL,"
+ " `shared` INTEGER NOT NULL,"
+ " `length` INTEGER NOT NULL"
+ ");";
startInsertStr="insert into sms (text,categorys,rate,sended,deleted,liked,readed,shared,length) VALUES('";
endInsertStr = ",-1,-1,-1,-1,-1,0);\n";
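As a side note, building each INSERT by string concatenation forces SQLite to re-parse the statement for every row (and is injection-prone). A parameterized batch insert inside a single transaction is the usual faster form; here is a minimal sketch using Python's stdlib sqlite3 module for illustration (the schema mirrors the table above, the sample rows are made up):

```python
import sqlite3

# Same table as in the question, with the NOT NULL typos fixed.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE sms ("
    " _id INTEGER PRIMARY KEY AUTOINCREMENT,"
    " categorys INTEGER NOT NULL, text TEXT NOT NULL, rate INTEGER NOT NULL,"
    " sended INTEGER NOT NULL, deleted INTEGER NOT NULL, liked INTEGER NOT NULL,"
    " readed INTEGER NOT NULL, shared INTEGER NOT NULL, length INTEGER NOT NULL)")

# Hypothetical rows as read from the file: (text, categorys, rate, sended,
# deleted, liked, readed, shared, length).
rows = [("hello", 5, 3, -1, -1, -1, -1, -1, 5),
        ("world", 5, 7, -1, -1, -1, -1, -1, 5)]

# One transaction for the whole batch; placeholders instead of concatenation.
with con:
    con.executemany(
        "INSERT INTO sms (text, categorys, rate, sended, deleted, liked,"
        " readed, shared, length) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)", rows)
```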
I read the data from a file and insert it into the DB with the above structure, but when I run a select like this:
select * from sms where categorys=5 order by rate desc limit 0,50
it takes more than 10 seconds to retrieve the data.
Upvotes: 0
Views: 1097
Reputation: 180927
Seems like an index would speed things up significantly:
CREATE INDEX `ix_categorys_rate_d` ON `sms`(`categorys`, `rate` DESC);
If you don't have an index, the query has to scan all rows looking for results, while with this index SQLite can locate all rows in the category very efficiently, and they already come out sorted by rate DESC, so no separate sort is needed for the ORDER BY.
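You can verify that the index is actually used with EXPLAIN QUERY PLAN. A minimal sketch with Python's stdlib sqlite3 module (toy data, not the asker's real table):

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Trimmed-down version of the question's schema, just the relevant columns.
con.execute("CREATE TABLE sms (_id INTEGER PRIMARY KEY AUTOINCREMENT,"
            " categorys INTEGER NOT NULL, text TEXT NOT NULL, rate INTEGER NOT NULL)")
con.executemany("INSERT INTO sms (categorys, text, rate) VALUES (?, ?, ?)",
                ((i % 10, "msg", i) for i in range(100_000)))

con.execute("CREATE INDEX ix_categorys_rate_d ON sms(categorys, rate DESC)")

plan = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT * FROM sms WHERE categorys=5 ORDER BY rate DESC LIMIT 50").fetchall()
# The plan's detail text should report a SEARCH using ix_categorys_rate_d
# rather than a full SCAN of the table.
print(plan)
```

Without the index, the same plan reports a full scan plus a temporary B-tree for the ORDER BY, which is where the seconds go on 100,000+ rows.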
Upvotes: 3