Reputation: 5097
I am having some doubts about a function that updates multiple entities, but does so one by one. There is of course a latency concern when working against a remote DB, but aside from that, I am worried that we could get an OutOfMemoryException because of the number of entities we are updating in a single transaction. My code looks something like this:
EntityHeader entityHeader = entityHeaderService.findById(id);
for (EntityDetail entityDetail : entityHeader.getDetails()) {
    for (Entity entity : entityDetail.getEntities()) {
        entity.setState(true);
        entityService.update(entity);
    }
}
This is just an example; we also have a similar case in another method, but with inserts instead. These methods can update or insert up to 2k or 3k entities in one transaction. So my question is: should we start using batch operations, or is the number of entities not big enough to worry about? Also, would it perform better as a batch operation?
Upvotes: 0
Views: 92
Reputation: 4542
When optimizing things, always ask yourself first whether it is worth the time.
Anyway, ~3k entities in one transaction doesn't sound bad, but there are benefits to JDBC batching even at those numbers, and it is quite easy to achieve.
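For example, with Hibernate as the JPA provider it is mostly configuration (a minimal sketch; "my-unit" and the property values are placeholders, not something from your code):

import java.util.HashMap;
import java.util.Map;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class BatchConfigExample {
    public static EntityManagerFactory createFactory() {
        Map<String, String> props = new HashMap<>();
        // Send pending UPDATE statements to the driver in batches of 50
        props.put("hibernate.jdbc.batch_size", "50");
        // Group updates by table so batches are not broken up mid-stream
        props.put("hibernate.order_updates", "true");
        return Persistence.createEntityManagerFactory("my-unit", props);
    }
}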
It is kinda hard to tell when you should worry about an OutOfMemoryException, as it depends on how much memory you give the JVM and how big the entities you are updating are. Just to give you some numbers: I personally ran into memory trouble when I had to insert between 10k and 100k rows in the same transaction with 4 GB of memory; I had to flush and clear the Hibernate persistence context every once in a while.
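Something along these lines (a minimal sketch reusing the types from your question; the EntityManager, BATCH_SIZE, and class name are assumptions on my side):

import java.util.ArrayList;
import java.util.List;
import javax.persistence.EntityManager;

public class FlushClearExample {
    private static final int BATCH_SIZE = 50; // keep in sync with hibernate.jdbc.batch_size

    public void updateAll(EntityManager em, EntityHeader entityHeader) {
        // Collect the entities up front so clearing the persistence
        // context doesn't break lazy loading mid-iteration.
        List<Entity> all = new ArrayList<>();
        for (EntityDetail detail : entityHeader.getDetails()) {
            all.addAll(detail.getEntities());
        }
        int count = 0;
        for (Entity entity : all) {
            entity.setState(true);
            em.merge(entity); // re-attach in case a previous clear() detached it
            if (++count % BATCH_SIZE == 0) {
                em.flush(); // push the pending UPDATEs to the DB as one batch
                em.clear(); // detach managed entities so the session doesn't keep growing
            }
        }
        em.flush(); // flush the final partial batch
    }
}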
Upvotes: 1