jagadeesh

Reputation: 252

Core Data Performance issue while Saving

I'm using Core Data with an NSSQLiteStoreType persistent store in my iOS app. I need to store a large number of objects in the database. To improve Core Data performance, I'm already trying several things:

  1. Saving in batches
  2. Saving only after the for loop ends
  3. Resetting the context to manage memory

But it still takes a very long time to save 100k objects. Please suggest best practices for improving Core Data performance when saving large amounts of data.

Upvotes: 2

Views: 774

Answers (2)

Marcus S. Zarra

Reputation: 46718

You should consider shipping your app with the data pre-populated to avoid most of the overhead of the import. Assuming the data is static enough (most data is), you can pre-load all of the data up to the point of shipping the app; then, when the app launches, it only needs to grab data from the ship date forward (or from the last refresh date forward).

As Leonid Usov said, you should also do the import on a background context and save to disk in batches. That will help keep memory down and UI performance up. But at the end of the day, importing a lot of data is expensive and should be avoided by pre-loading as much as possible.
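A common way to do the pre-loading is to bundle a seed SQLite file with the app and copy it into place before adding the persistent store to the coordinator. A minimal sketch, assuming a seed file named Seed.sqlite in the app bundle (the names are placeholders for illustration):

```objc
#import <Foundation/Foundation.h>

// Copy the bundled, pre-populated store into place on first launch,
// before -addPersistentStoreWithType:configuration:URL:options:error:.
void copySeedStoreIfNeeded(NSURL *storeURL) {
    NSFileManager *fm = [NSFileManager defaultManager];
    if ([fm fileExistsAtPath:storeURL.path]) {
        return; // store already installed; nothing to do
    }
    NSURL *seedURL = [[NSBundle mainBundle] URLForResource:@"Seed"
                                             withExtension:@"sqlite"];
    NSError *error = nil;
    if (seedURL && ![fm copyItemAtURL:seedURL toURL:storeURL error:&error]) {
        NSLog(@"Failed to copy seed store: %@", error);
    }
}
```

One caveat: if the seed store was created with WAL journaling (the default on recent SDKs), the -wal and -shm sidecar files must be copied too, or the seed should be generated with the `NSSQLitePragmasOption` set to `@{ @"journal_mode" : @"DELETE" }` so everything lives in the single .sqlite file.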

Upvotes: 0

Leonid Usov

Reputation: 1598

  1. You should do the import on a non-UI thread, with a context bound directly to the persistent store coordinator, not a child context of the main context
  2. You should invoke [managedObjectContext save:] once every several hundred new objects inserted, depending on the object size and graph complexity. See this answer for details
  3. You should wrap your batch from step 2 in an @autoreleasepool block and reset the context after saving, before the autoreleasepool block exits. See this answer
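The three steps above can be sketched as follows (assuming the objects come in as an array of dictionaries and an entity named "Item"; the batch size of 500 is a starting point to tune, not a recommendation from the answer):

```objc
#import <CoreData/CoreData.h>

void importRecords(NSArray *records, NSPersistentStoreCoordinator *psc) {
    // Step 1: private-queue context bound directly to the coordinator,
    // so saves go straight to disk without pushing through a parent.
    NSManagedObjectContext *context = [[NSManagedObjectContext alloc]
        initWithConcurrencyType:NSPrivateQueueConcurrencyType];
    context.persistentStoreCoordinator = psc;

    const NSUInteger batchSize = 500; // tune to object size/graph complexity

    [context performBlock:^{
        for (NSUInteger start = 0; start < records.count; start += batchSize) {
            // Step 3: one autoreleasepool per batch keeps peak memory flat.
            @autoreleasepool {
                NSRange range = NSMakeRange(start,
                    MIN(batchSize, records.count - start));
                for (NSDictionary *record in
                         [records subarrayWithRange:range]) {
                    NSManagedObject *object = [NSEntityDescription
                        insertNewObjectForEntityForName:@"Item"
                                 inManagedObjectContext:context];
                    [object setValuesForKeysWithDictionary:record];
                }
                // Step 2: save once per batch, not once per object.
                NSError *error = nil;
                if (![context save:&error]) {
                    NSLog(@"Batch save failed: %@", error);
                }
                // Reset before the pool drains so the saved objects
                // and their row cache entries are released.
                [context reset];
            }
        }
    }];
}
```

Because the context is not a child of the main context, the UI context won't see the imported objects automatically; merge the changes via NSManagedObjectContextDidSaveNotification or refetch when the import finishes.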

Upvotes: 3
