Reputation: 252
I'm using Core Data with an NSSQLiteStoreType store in my iOS app, and I need to save a large number of objects to the database. I'm already considering a number of the usual suggestions for improving Core Data performance.
But it is still taking a very long time to save 100k objects. Please suggest best practices for improving Core Data performance when saving large amounts of data.
Upvotes: 2
Views: 774
Reputation: 46718
You should consider shipping your app with the data pre-populated to avoid most of the overhead of the import. Assuming the data is static enough (most data is), you can pre-load everything up to the point of shipping the app; then, when the app launches, it only needs to fetch data from the ship date forward (or from the last refresh date forward).
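If you go the pre-populated route, one common way to do it is to ship a seed .sqlite file in the app bundle and copy it into place on first launch, before adding the persistent store. Here is a minimal sketch of that idea; "Seed.sqlite", storeURL, and persistentStoreCoordinator are placeholder names, not anything from the question.

// Copy a bundled seed database into place on first launch (sketch).
NSFileManager *fileManager = [NSFileManager defaultManager];
if (![fileManager fileExistsAtPath:storeURL.path]) {
    NSURL *seedURL = [[NSBundle mainBundle] URLForResource:@"Seed" withExtension:@"sqlite"];
    NSError *copyError = nil;
    if (![fileManager copyItemAtURL:seedURL toURL:storeURL error:&copyError]) {
        NSLog(@"Could not copy seed store: %@", copyError);
    }
    // If the seed store was saved with WAL journaling (the default on iOS 7+),
    // copy its -wal and -shm files too, or export the seed with journal_mode=DELETE.
}

NSError *storeError = nil;
[persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                         configuration:nil
                                                   URL:storeURL
                                               options:nil
                                                 error:&storeError];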
As Leonid Usov said, you should also do the import on a background context and save to disk in batches. That will help keep memory down and UI performance up. But at the end of the day, importing a lot of data is intensive and should be avoided by pre-loading as much as possible.
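Here is a minimal sketch of moving the import off the main queue with a private-queue context; persistentStoreCoordinator, records, and the "Item" entity are placeholders, not anything from the question.

NSManagedObjectContext *importContext =
    [[NSManagedObjectContext alloc] initWithConcurrencyType:NSPrivateQueueConcurrencyType];
importContext.persistentStoreCoordinator = persistentStoreCoordinator;
importContext.undoManager = nil; // make sure no undo manager is tracking the inserts

[importContext performBlock:^{
    NSUInteger count = 0;
    for (NSDictionary *record in records) {
        NSManagedObject *item =
            [NSEntityDescription insertNewObjectForEntityForName:@"Item"
                                          inManagedObjectContext:importContext];
        [item setValuesForKeysWithDictionary:record];
        if (++count % 500 == 0) {       // save to disk in batches
            NSError *error = nil;
            [importContext save:&error];
        }
    }
    NSError *error = nil;
    [importContext save:&error];        // save whatever is left over
}];

Doing the work inside performBlock: keeps the main thread free for the UI, and the periodic saves keep the unsaved object graph from growing without bound.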
Upvotes: 0
Reputation: 1598
Call [managedObjectContext save:] once every several hundred new objects inserted, depending on the object size and graph complexity. See this answer for details.
Wrap the inserts in an @autoreleasepool block and reset the context after the save, before the autorelease block is exited. See this answer. A sketch of this pattern follows below.
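Here is a minimal sketch of that batching pattern; importContext (a background context), the records array of dictionaries, and the "Item" entity are placeholder names, not anything from the original answer.

static const NSUInteger kBatchSize = 500;   // tune for object size and graph complexity

NSUInteger total = records.count;
for (NSUInteger start = 0; start < total; start += kBatchSize) {
    @autoreleasepool {
        NSUInteger end = MIN(start + kBatchSize, total);
        for (NSUInteger i = start; i < end; i++) {
            NSManagedObject *item =
                [NSEntityDescription insertNewObjectForEntityForName:@"Item"
                                              inManagedObjectContext:importContext];
            [item setValuesForKeysWithDictionary:records[i]];
        }
        NSError *error = nil;
        if (![importContext save:&error]) {
            NSLog(@"Batch save failed: %@", error);
        }
        // Reset before the pool drains so the saved objects are released
        // and memory stays roughly flat across the whole import.
        [importContext reset];
    }
}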
Upvotes: 3