Reputation: 143
I'm loading a large amount of data in JSON format (more than 2000 entities) into Core Data whenever the user refreshes the page. What I'm doing right now works fine but is time-consuming. I was considering using some kind of pagination, but that would need backend modifications. Hopefully someone can help me optimize the process, or point me to another solution for storing a large amount of data on iOS.
Here is the part that costs the most time:
[moc performBlock:^{
    for (NSDictionary *dictionary in dataObjectsArray) {
        NSPredicate *predicate = [ObjectA predicateWithDictionary:dictionary];
        NSFetchRequest *request = [[NSFetchRequest alloc] initWithEntityName:ENTITY_NAME];
        request.predicate = predicate;
        NSError *error = nil;
        NSArray *fetchedObjects = [moc executeFetchRequest:request
                                                     error:&error];
        ObjectA *objectATemp = (ObjectA *)[fetchedObjects lastObject];
        if (!objectATemp) {
            NSEntityDescription *entityDescription = [NSEntityDescription entityForName:ENTITY_NAME
                                                                 inManagedObjectContext:moc];
            objectATemp = [[ObjectA alloc] initWithEntity:entityDescription
                           insertIntoManagedObjectContext:moc];
        }
        [ObjectA setObjectA:objectATemp
                 dictionary:dictionary];
        // check if user already liked the ObjectA
        ObjectB *likedObject = [ObjectB objectBWithId:objectATemp.id];
        if (likedObject &&
            !objectATemp.user_liked.boolValue) {
            [likedObject.managedObjectContext deleteObject:likedObject];
        }
    }
    NSError *error = nil;
    if ([moc hasChanges] &&
        ![moc save:&error]) {
        NSLog(@"%@", error);
    }
    // saving Context
    NSManagedObjectContext *managedObjectContext = [self newManagedObjectContext];
    [managedObjectContext performBlock:^{
        NSError *error = nil;
        if ([managedObjectContext hasChanges] &&
            ![managedObjectContext save:&error]) {
            NSLog(@"%@", error);
        }
        if (completionHandler) {
            completionHandler();
        }
    }];
}];
Any advice is appreciated.
Upvotes: 0
Views: 1519
Reputation: 70936
Storing a lot of data with Core Data isn't a problem; 2000 records isn't even "a lot". Saving a lot of new data every time the user taps a button is going to be slow. The best solution would be to not need to store all of this data every time.
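One way to not need it is to skip the import entirely when the server response hasn't changed. A sketch only; responseData and kLastImportHashKey are hypothetical names, and the digest uses CommonCrypto:

#import <CommonCrypto/CommonDigest.h>

// Hex-encoded SHA-256 of the raw response bytes.
static NSString *SHA256HexString(NSData *data) {
    unsigned char digest[CC_SHA256_DIGEST_LENGTH];
    CC_SHA256(data.bytes, (CC_LONG)data.length, digest);
    NSMutableString *hex = [NSMutableString stringWithCapacity:CC_SHA256_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_SHA256_DIGEST_LENGTH; i++) {
        [hex appendFormat:@"%02x", digest[i]];
    }
    return hex;
}

// Before kicking off the Core Data import:
NSString *hash = SHA256HexString(responseData);
NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
if ([hash isEqualToString:[defaults stringForKey:kLastImportHashKey]]) {
    return; // same payload as last time; skip the import
}
// ... run the import; record the hash only after a successful save:
[defaults setObject:hash forKey:kLastImportHashKey];

If your backend supports it, an ETag or Last-Modified header does the same job without hashing anything client-side.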
But there are also some significant inefficiencies in your code.
You do a fetch request on every pass through the loop. Assuming that your dataObjectsArray contains 2000 objects, that's 2000 fetches, which is easily the least efficient way to fetch the data. You would get a major improvement if you could do a single fetch, or fetch 100 or 200 objects at a time instead of just one for each of the 2000 passes. It's hard to tell what the best approach would be, since you didn't describe your predicates or the other methods your code calls, but this is the first thing to do and will give the biggest improvement. If you can't get your data in smaller chunks, you can at least process it in larger ones. You might need to make changes to methods like setObjectA:dictionary:, objectBWithId:, etc.
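If the incoming dictionaries carry a unique identifier, you can replace the per-object fetches with one fetch for the whole batch (the find-or-create pattern). A sketch, assuming each dictionary has an @"id" value that maps to ObjectA's id attribute; dataObjectsArray, ENTITY_NAME, and setObjectA:dictionary: are from your code:

NSArray *incomingIds = [dataObjectsArray valueForKey:@"id"];

// One fetch for every object in the batch instead of one per object.
NSFetchRequest *request = [[NSFetchRequest alloc] initWithEntityName:ENTITY_NAME];
request.predicate = [NSPredicate predicateWithFormat:@"id IN %@", incomingIds];

NSError *error = nil;
NSArray *existingObjects = [moc executeFetchRequest:request error:&error];

// Index the existing objects by id so each dictionary can be matched
// without another round trip to the store.
NSMutableDictionary *objectsById = [NSMutableDictionary dictionaryWithCapacity:existingObjects.count];
for (ObjectA *object in existingObjects) {
    objectsById[object.id] = object;
}

for (NSDictionary *dictionary in dataObjectsArray) {
    ObjectA *objectA = objectsById[dictionary[@"id"]];
    if (!objectA) {
        // Not in the store yet; insert a new object.
        objectA = [NSEntityDescription insertNewObjectForEntityForName:ENTITY_NAME
                                                inManagedObjectContext:moc];
    }
    [ObjectA setObjectA:objectA dictionary:dictionary];
}

That's one fetch instead of 2000, and the lookup inside the loop is a constant-time dictionary access. If a single IN predicate gets too large, split incomingIds into chunks of a few hundred.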
Also, some of your objects are the same every time the loop runs. If an object will be the same every time, create it once, not (potentially) 2000 times. For example, entityDescription. That's much less of a problem than the thousands of fetch requests, but it should provide some small improvement.
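A minimal sketch of that change, hoisting the lookup out of the loop:

// entityDescription never changes, so look it up once, not per pass.
NSEntityDescription *entityDescription = [NSEntityDescription entityForName:ENTITY_NAME
                                                     inManagedObjectContext:moc];
for (NSDictionary *dictionary in dataObjectsArray) {
    // ... fetch as before; only when nothing was found:
    objectATemp = [[ObjectA alloc] initWithEntity:entityDescription
                   insertIntoManagedObjectContext:moc];
    // ...
}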
Upvotes: 3
Reputation: 213
If your user will not see all 2000 at the same time, you can try downloading 100 or 200 entities first, showing those to the user, and downloading the rest of the entities in the background. The other way is to use pagination in the backend, as you said.
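A rough sketch of the first idea; fetchEntitiesFromOffset:limit:completion: and importEntities: are hypothetical stand-ins for your networking call and the Core Data import from the question:

static const NSUInteger kPageSize = 200;

- (void)loadEntitiesFromOffset:(NSUInteger)offset {
    [self fetchEntitiesFromOffset:offset
                            limit:kPageSize
                       completion:^(NSArray *entities) {
        [self importEntities:entities]; // show the first page as soon as it lands
        if (entities.count == kPageSize) {
            // A full page suggests more remain; keep paging in the background.
            [self loadEntitiesFromOffset:offset + kPageSize];
        }
    }];
}

Kick it off with [self loadEntitiesFromOffset:0]; the user sees the first 200 entities while the rest stream in.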
Upvotes: 0