Akash Malhotra

Reputation: 1106

SQLite fetching on iOS device slow but fast on simulator

I have a database with 3 tables, each containing 700,000 records. I have added a search feature in the app which uses this query...

const char *sqlStatement = "SELECT * FROM artist WHERE name LIKE ?";
sqlite3_stmt *compiledStatement;
if (sqlite3_prepare_v2(database, sqlStatement, -1, &compiledStatement, NULL) == SQLITE_OK)
{
    // Bind the search text wrapped in % wildcards, e.g. %beatles%
    sqlite3_bind_text(compiledStatement, 1,
                      [[NSString stringWithFormat:@"%%%@%%", self.searchBar.text] UTF8String],
                      -1, SQLITE_STATIC);
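    // (Sketch of how a statement like this is typically consumed; the column
    // index and result handling below are illustrative assumptions, not code
    // from the question.)
    while (sqlite3_step(compiledStatement) == SQLITE_ROW)
    {
        const unsigned char *name = sqlite3_column_text(compiledStatement, 1);
        // ... convert via [NSString stringWithUTF8String:(const char *)name]
        //     and append to the results array used by the search screen ...
    }
    sqlite3_finalize(compiledStatement);
}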

This was OK on the simulator but took about a minute on the iOS device. So I used SQLite Manager to add indexes to the table columns; the database size grew from 76 MB to 166 MB, but now this query takes about 1 to 2 seconds on the simulator and about 10 to 15 seconds on the device. So it's an improvement, but are there any suggestions to improve it further? No, I cannot use Core Data at this point.

Upvotes: 0

Views: 1601

Answers (2)

kuntergunt

Reputation: 11

Use the FTS feature of SQLite. FTS is enabled by default and will solve the query performance issue with the "like". You need to add all rows to an FTS virtual table, then use "match" instead of "like". See the documentation here: http://www.sqlite.org/fts3.html. You can expect performance in milliseconds instead of seconds.
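
A minimal sketch of that approach with the same C API the question uses (the artist_fts table name, the single indexed name column, and the prefix-style query are assumptions; FTS3/FTS4 must be available in the linked SQLite):

// One-time setup: build an FTS virtual table and copy the artist names into it.
sqlite3_exec(database,
             "CREATE VIRTUAL TABLE artist_fts USING fts4(name);"
             "INSERT INTO artist_fts(docid, name) SELECT rowid, name FROM artist;",
             NULL, NULL, NULL);

// Search with MATCH instead of LIKE. Appending * matches word prefixes;
// note that MATCH works on whole tokens, not arbitrary substrings like %...%.
const char *ftsQuery = "SELECT docid, name FROM artist_fts WHERE name MATCH ?";
sqlite3_stmt *statement;
if (sqlite3_prepare_v2(database, ftsQuery, -1, &statement, NULL) == SQLITE_OK)
{
    NSString *term = [NSString stringWithFormat:@"%@*", self.searchBar.text];
    sqlite3_bind_text(statement, 1, [term UTF8String], -1, SQLITE_TRANSIENT);
    while (sqlite3_step(statement) == SQLITE_ROW)
    {
        // handle each matching row
    }
    sqlite3_finalize(statement);
}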

Upvotes: 1

Philip Sheard

Reputation: 5825

The first point to note is that SQLite will not use an index for LIKE clauses, even if they are of the form LIKE '...%' (e.g. LIKE 'Fred%'); the point is that LIKE is case-insensitive by default, while most collation sequences are case-sensitive, so the index ordering cannot be relied on. The improvement in performance that you observe is due to SQLite using what is called an index search - it can search the index rather than having to plough through the whole table.

15 seconds for 700,000 records is not bad; in fact, it is about what I am getting myself for a very similar query. To get any improvement on this, you would have to look into doing a full-text search (FTS). This involves a lot more work than simply adding an index, but it may be the way to go for you.

If the query returns a lot of rows, and you are using these to populate a table view, there may be another performance issue due to having to process so many rows, even if you are only pulling off the row IDs. My solution was to limit the number of rows fetched to about 5,000, on the grounds that nobody would want to scroll through more than that.
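
For example, that cap could be applied directly in the SQL (a sketch reusing the question's query and database handle; the 5,000 figure simply mirrors the limit described above):

const char *cappedQuery = "SELECT * FROM artist WHERE name LIKE ? LIMIT 5000";
sqlite3_stmt *statement;
if (sqlite3_prepare_v2(database, cappedQuery, -1, &statement, NULL) == SQLITE_OK)
{
    sqlite3_bind_text(statement, 1,
                      [[NSString stringWithFormat:@"%%%@%%", self.searchBar.text] UTF8String],
                      -1, SQLITE_TRANSIENT);
    while (sqlite3_step(statement) == SQLITE_ROW)
    {
        // populate at most 5,000 rows for the table view
    }
    sqlite3_finalize(statement);
}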

Upvotes: 1
