Reputation: 134
I've got an application that stores products in a Core Data file. These products include images as "Transformable" data. Now I tried adding some attributes using lightweight migration. When I tested this with a small database it worked well, but when I use a really large one (nearly 500 MB) the application usually crashes because of low memory. Does anybody know how to solve this problem?
Thanks in advance!
Upvotes: 2
Views: 1437
Reputation: 70976
You'll have to use one of the other migration options. The automatic lightweight migration process is really convenient to use. But it has the drawback that it loads the entire data store into memory at once. Two copies, really, one for before migration and one for after.
First, can any of this data be re-created or re-downloaded? If so, you might be able to use a custom mapping model from the old version to the new one. With a custom mapping model you can indicate that some attributes don't get migrated, which reduces memory issues by throwing out that data. Then when migration is complete, recreate or re-download that data.
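For reference, a migration with a custom mapping model is driven by `NSMigrationManager` rather than by the automatic-migration options on the persistent store coordinator. A minimal sketch of one migration run, assuming SQLite stores; the function name and the URL/model parameters are placeholders, not from your project:

```objc
#import <CoreData/CoreData.h>

// Minimal sketch: run a migration with a custom mapping model.
// URLs and model objects are assumed to be set up by the caller.
BOOL MigrateStore(NSURL *sourceURL, NSURL *destinationURL,
                  NSManagedObjectModel *sourceModel,
                  NSManagedObjectModel *destinationModel,
                  NSError **error)
{
    // Find the custom mapping model you built in Xcode. Attributes you left
    // out of the mapping simply aren't copied, which is how you "throw out"
    // re-downloadable data during migration.
    NSMappingModel *mapping =
        [NSMappingModel mappingModelFromBundles:nil // nil = main bundle
                                 forSourceModel:sourceModel
                               destinationModel:destinationModel];
    if (mapping == nil) {
        return NO;
    }

    NSMigrationManager *manager =
        [[NSMigrationManager alloc] initWithSourceModel:sourceModel
                                       destinationModel:destinationModel];

    // Migrates into a separate file; replace the old store with it on success.
    return [manager migrateStoreFromURL:sourceURL
                                   type:NSSQLiteStoreType
                                options:nil
                       withMappingModel:mapping
                       toDestinationURL:destinationURL
                        destinationType:NSSQLiteStoreType
                     destinationOptions:nil
                                  error:error];
}
```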
If that's not possible, Apple suggests a multiple-pass technique using multiple mapping models. If you have several entity types that contribute to the large data store size, it can help. Basically you end up migrating different entity types in different passes, so you avoid the overhead of loading everything at once.
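As I understand Apple's description of the multi-pass technique, it's the same `NSMigrationManager` call in a loop, with one mapping model per pass and each pass covering a disjoint subset of the entities (relationships can't cross passes). A rough sketch, with placeholder mapping-model names:

```objc
#import <CoreData/CoreData.h>

// Rough sketch of the multiple-pass technique: one hand-built mapping model
// per pass, each covering a subset of the entities, so no single pass holds
// the whole store in memory. Mapping model names here are placeholders.
BOOL MigrateStoreInPasses(NSURL *sourceURL, NSURL *destinationURL,
                          NSManagedObjectModel *sourceModel,
                          NSManagedObjectModel *destinationModel)
{
    NSArray *passNames = @[ @"ProductsPass", @"ImagesPass" ];
    for (NSString *name in passNames) {
        NSURL *mappingURL = [[NSBundle mainBundle] URLForResource:name
                                                    withExtension:@"cdm"];
        NSMappingModel *mapping =
            [[NSMappingModel alloc] initWithContentsOfURL:mappingURL];
        if (mapping == nil) {
            return NO;
        }
        // A fresh manager per pass; each pass writes into the same
        // destination store file.
        NSMigrationManager *manager =
            [[NSMigrationManager alloc] initWithSourceModel:sourceModel
                                           destinationModel:destinationModel];
        NSError *error = nil;
        if (![manager migrateStoreFromURL:sourceURL
                                     type:NSSQLiteStoreType
                                  options:nil
                         withMappingModel:mapping
                         toDestinationURL:destinationURL
                          destinationType:NSSQLiteStoreType
                       destinationOptions:nil
                                    error:&error]) {
            NSLog(@"Migration pass %@ failed: %@", name, error);
            return NO;
        }
    }
    return YES;
}
```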
If that doesn't apply either (e.g. the bloat is all from instances of the same entity type), well, it's time to write your own custom migration code. This will involve setting up two Core Data stacks, one with the existing data and one with the new model. Run through the existing data store, creating new objects in the new store. If you do this in batches you'll be able to keep memory under control. The general approach would be (there's a rough sketch of this loop below):

- Fetch the old objects in batches and create corresponding objects in the new store.
- Save the new store as you go, keeping a mapping of `NSManagedObjectID`s from the old store to the new one, for use in the next step (e.g. re-creating relationships between the new objects).
- To keep memory use low, call `refreshObject:mergeChanges:` with `NO` for the second argument once you're finished with each old object. This turns the object back into a fault and releases its attribute values.
- Every so often, save changes on the managed object context and then `reset` it. The interval is a balancing act: do it too often and you'll slow down unnecessarily, do it too rarely and memory use rises.

While you are at it, consider why your data store is so big. Are you storing a bunch of binary data blobs in the data store? If so, make sure you're using the "Allows external storage" option in the new model.
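Here's that batched loop as a rough sketch. It assumes a single `Product` entity that exists in both models and two stacks built elsewhere (`oldContext` on the old store, `newContext` on the new one); the entity and attribute names are placeholders:

```objc
#import <CoreData/CoreData.h>

// Batched manual migration sketch. Returns a map of old NSManagedObjectIDs
// to new ones, for re-creating relationships afterwards. "Product", "name"
// and "image" are placeholder names.
NSDictionary *MigrateProducts(NSManagedObjectContext *oldContext,
                              NSManagedObjectContext *newContext)
{
    static const NSUInteger kBatchSize = 100; // tune this for your data
    NSError *error = nil;

    // Fetch only the object IDs up front; this is cheap even for a huge store.
    NSFetchRequest *idRequest =
        [NSFetchRequest fetchRequestWithEntityName:@"Product"];
    idRequest.resultType = NSManagedObjectIDResultType;
    idRequest.includesPropertyValues = NO; // don't populate the row cache
    NSArray *oldIDs = [oldContext executeFetchRequest:idRequest error:&error];

    NSMutableDictionary *idMap = [NSMutableDictionary dictionary];
    // New objects created since the last save; their IDs are temporary
    // until the context is saved.
    NSMutableDictionary *pendingPairs = [NSMutableDictionary dictionary];

    NSUInteger processed = 0;
    for (NSManagedObjectID *oldID in oldIDs) {
        NSManagedObject *oldProduct = [oldContext objectWithID:oldID];
        NSManagedObject *newProduct =
            [NSEntityDescription insertNewObjectForEntityForName:@"Product"
                                          inManagedObjectContext:newContext];
        [newProduct setValue:[oldProduct valueForKey:@"name"] forKey:@"name"];
        [newProduct setValue:[oldProduct valueForKey:@"image"] forKey:@"image"];
        pendingPairs[oldID] = newProduct;

        // Turn the old object back into a fault so its (large) image data
        // can be released right away.
        [oldContext refreshObject:oldProduct mergeChanges:NO];

        if (++processed % kBatchSize == 0) {
            if (![newContext save:&error]) { /* handle the error */ }
            // After the save the new objects have permanent IDs; record
            // them, then reset both contexts to drop everything from memory.
            [pendingPairs enumerateKeysAndObjectsUsingBlock:
                ^(NSManagedObjectID *old, NSManagedObject *obj, BOOL *stop) {
                    idMap[old] = obj.objectID;
                }];
            [pendingPairs removeAllObjects];
            [newContext reset];
            [oldContext reset];
        }
    }

    // Save and record whatever is left in the final partial batch.
    if (![newContext save:&error]) { /* handle the error */ }
    [pendingPairs enumerateKeysAndObjectsUsingBlock:
        ^(NSManagedObjectID *old, NSManagedObject *obj, BOOL *stop) {
            idMap[old] = obj.objectID;
        }];
    return idMap;
}
```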
Upvotes: 8