Kirill Pyulzyu

Reputation: 381

CoreData, NSFetchedResultsController and performFetch:

I have a Core Data store with 450,000 records.

To show it in a UITableView I am using NSFetchedResultsController. It works, but with a big problem.

Before the NSFetchedResultsController starts working, I need to call performFetch:. In my case this call takes about 2-3 minutes. After that I can show the data in the UITableView without any problems. But those 2-3 minutes are killing me :(

It would not be so bad, but I also need to search in this table. For searching I have to change the predicate, call performFetch: again, and wait another 2-3 minutes!

Is there any way to make performFetch: faster? Or can somebody at least tell me how to search without calling performFetch:?
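
A simplified version of the setup (the entity name "Record", the sort key "name" and the search predicate are placeholders, not the actual model):

    #import <CoreData/CoreData.h>

    // One fetched results controller over the whole table.
    // "context" is the managed object context; "searchText" is the current search string.
    NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Record"];
    request.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"name" ascending:YES]];

    NSFetchedResultsController *frc =
        [[NSFetchedResultsController alloc] initWithFetchRequest:request
                                             managedObjectContext:context
                                               sectionNameKeyPath:nil
                                                        cacheName:nil];

    NSError *error = nil;
    [frc performFetch:&error];          // this call takes 2-3 minutes

    // For the search, the predicate is changed and performFetch: is called again:
    request.predicate = [NSPredicate predicateWithFormat:@"name CONTAINS[cd] %@", searchText];
    [frc performFetch:&error];          // another 2-3 minute wait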

Upvotes: 1

Views: 564

Answers (2)

Anurag

Reputation: 141859

2-3 minutes is a really long time to go through 450,000 records. I suspect that most of the time is being spent inside some SQLite call rather than in your own code. Use the Time Profiler and look at the calls that are taking the most time under the performFetch: tree.

To see if your SQL is optimized, enable SQLite logging by adding this to the "Arguments Passed on Launch" in the Scheme settings:

-com.apple.CoreData.SQLDebug 1

See this question for more details on this.

Run your app and make note of the queries that are executed when you call performFetch:. Next, read up on EXPLAIN QUERY PLAN. This is the single most helpful tool you can use if SQLite is the bottleneck. You'd need to grab the .sqlite file from your application (simulator or device), run EXPLAIN QUERY PLAN on your query, and see why it is taking so long. If I had to guess, I'd say your query is doing a full table scan instead of using an appropriate index. A full table scan is costly.
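
To find the store file, you can log its path from the app. A minimal sketch; the example query in the comment uses hypothetical table and column names:

    #import <CoreData/CoreData.h>

    // Log the on-disk location of the persistent store so it can be copied
    // and opened with the sqlite3 command-line tool.
    - (void)logStoreURL:(NSPersistentStoreCoordinator *)coordinator {
        NSPersistentStore *store = [[coordinator persistentStores] lastObject];
        NSLog(@"SQLite store at: %@", [[coordinator URLForPersistentStore:store] path]);
        // Then, in Terminal (table and column names below are hypothetical):
        //   sqlite3 /path/to/Store.sqlite
        //   sqlite> EXPLAIN QUERY PLAN SELECT * FROM ZRECORD WHERE ZNAME LIKE 'abc%';
        // "SCAN TABLE" in the output means a full table scan;
        // "SEARCH TABLE ... USING INDEX" means an index is being used.
    }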

If that is indeed the bottleneck, it could also help you optimize your searches.

Here are some helpful links.

  1. How can I improve core data fetch performance on the iPhone?
  2. Optimizing Core Data searches and sorts

Upvotes: 0

Mundi

Reputation: 80265

2-3 minutes is definitely too long for a fetch of just 450,000 records. Are you parsing strings with a predicate? Make sure you use these optimization strategies:

  • Optimize your fetch by tweaking fetchBatchSize. (Try a few settings and see what works best for your data; see the sketch after this list.)
  • Avoid complicated queries for the section headers. If you are calculating the section headings through some attributes of relationships, reconsider your approach. It might be better to fetch the entity that holds the section information and fill the rows in each section from there.
  • Did you index the fields by which you search and sort?
  • Wherever possible, use fetch request templates defined in the model. This will speed things up significantly.
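
A minimal sketch of the first and last points, assuming a fetch request template named "allRecords" defined in the model and a sort key "name" (both placeholder names; indexing itself is switched on per attribute in the model editor, not in code):

    #import <CoreData/CoreData.h>

    // Build the fetch request from a template defined in the managed object model
    // and batch the results, so the fetched results controller only materializes
    // the rows that are actually on screen.
    - (void)setUpFetchedResultsControllerWithModel:(NSManagedObjectModel *)model
                                           context:(NSManagedObjectContext *)context {
        NSFetchRequest *request = [model fetchRequestFromTemplateWithName:@"allRecords"
                                                    substitutionVariables:@{}];
        request.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"name" ascending:YES]];
        request.fetchBatchSize = 50;   // tune this; rows outside the current batch stay as faults

        NSFetchedResultsController *frc =
            [[NSFetchedResultsController alloc] initWithFetchRequest:request
                                                 managedObjectContext:context
                                                   sectionNameKeyPath:nil
                                                            cacheName:@"RecordsCache"];
        NSError *error = nil;
        if (![frc performFetch:&error]) {
            NSLog(@"Fetch failed: %@", error);
        }
        self.fetchedResultsController = frc;   // assumed property on the view controller
    }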

As for search, here is a strategy that worked for me:

  • React to search only after a couple of keystrokes (say, two).
  • After a valid keystroke, start a timer with, say, 0.2 seconds. Only start the new fetch if there has not been another keystroke before the timer fires.
  • Fetch in the background. Reload the table view on the main thread after the fetch is finished (see the sketch after this list).
  • Maintain an array of scheduled search fetches. If a scheduled fetch gets canceled by the UI (another keystroke, for example), don't start it.
  • Do not use case-insensitive and diacritic-insensitive search, as this is very expensive. Rather, just search from the beginning of words/names if this is feasible. If necessary, build an index with just the words (simplified, lowercase) to be searched in a separate entity, as suggested in Apple's WWDC12 videos.
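
A minimal sketch of the debounce-and-background-fetch part of this strategy, assuming an entity "Record" with a lowercased, indexed searchName attribute, and assumed properties searchTimer, backgroundContext, mainContext, fetchedObjects and tableView on the view controller:

    #import <CoreData/CoreData.h>

    // Called on every keystroke in the search field.
    - (void)searchTextDidChange:(NSString *)text {
        if (text.length < 2) return;       // react only after two keystrokes
        [self.searchTimer invalidate];     // a newer keystroke cancels the scheduled fetch
        self.searchTimer = [NSTimer scheduledTimerWithTimeInterval:0.2
                                                            target:self
                                                          selector:@selector(performScheduledSearch:)
                                                          userInfo:[text lowercaseString]
                                                           repeats:NO];
    }

    - (void)performScheduledSearch:(NSTimer *)timer {
        NSString *term = timer.userInfo;
        // backgroundContext uses NSPrivateQueueConcurrencyType and the same
        // persistent store coordinator as the main context.
        [self.backgroundContext performBlock:^{
            NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Record"];
            // BEGINSWITH on an indexed, pre-lowercased attribute is much cheaper
            // than CONTAINS[cd] on the display attribute.
            request.predicate = [NSPredicate predicateWithFormat:@"searchName BEGINSWITH %@", term];
            request.resultType = NSManagedObjectIDResultType;   // only object IDs cross threads
            NSError *error = nil;
            NSArray *objectIDs = [self.backgroundContext executeFetchRequest:request error:&error];
            dispatch_async(dispatch_get_main_queue(), ^{
                NSMutableArray *results = [NSMutableArray arrayWithCapacity:objectIDs.count];
                for (NSManagedObjectID *objectID in objectIDs) {
                    [results addObject:[self.mainContext objectWithID:objectID]];
                }
                self.fetchedObjects = results;
                [self.tableView reloadData];
            });
        }];
    }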

Upvotes: 3
