Reputation: 39
Let's say I need to create 100,000 records and then .save()
them. All records must be saved together, and only one transaction is allowed. Clearly, this is a problem and the program can run out of memory. But what will happen if I do the same work in the same transaction but in chunks, 1,000 records per iteration, and then save them? Will that fix the problem? Intuitively I think it will not and that the design is wrong. What happens to entities after I .save()
them? Does JPA still keep references to all saved entities? Where can I read more on this topic? Thanks.
Upvotes: 0
Views: 501
Reputation: 10716
But what will happen if I do the same work in the same transaction but in chunks, 1,000 records per iteration, and then save them? Will that fix the problem?
It will fix the problem if you use the following approach:
while (hasNextBatch()) {
    saveBatch();
    entityManager.flush();  // push the pending INSERTs to the database
    entityManager.clear();  // detach the flushed entities so they can be garbage collected
}
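As a minimal sketch (assuming an already configured EntityManagerFactory, a resource-local transaction, and a hypothetical MyRecord entity), the same idea applied to 100,000 records in a single transaction could look like this:

EntityManager em = entityManagerFactory.createEntityManager();
EntityTransaction tx = em.getTransaction();
tx.begin();
try {
    int batchSize = 1000;
    for (int i = 0; i < 100_000; i++) {
        em.persist(new MyRecord(i));   // entity becomes managed by the persistence context
        if ((i + 1) % batchSize == 0) {
            em.flush();                // send the pending INSERTs to the database
            em.clear();                // detach the batch so it can be garbage collected
        }
    }
    tx.commit();                       // everything is still committed as one transaction
} catch (RuntimeException e) {
    if (tx.isActive()) {
        tx.rollback();
    }
    throw e;
} finally {
    em.close();
}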
What happens with entities after I .save()? Does JPA still keep references to all saved entities?
Yes (unless you do the above).
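To illustrate (again with the hypothetical MyRecord entity): every persisted entity stays managed by the persistence context until it is cleared or detached, and those references are what consume the memory:

MyRecord r = new MyRecord(1);
em.persist(r);
em.contains(r); // true:  the persistence context still holds a reference to r
em.clear();
em.contains(r); // false: r is detached and eligible for garbage collection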
Where can I read more on this topic?
Here, for example.
Upvotes: 1