Reputation: 5085
I'm working on a JAX-RS web application that reads the complete content of a folder into a database. The files in that folder can be very big (±100 MB). JAXB is used for unmarshalling the XML into Java objects, which are persisted into the DB using Hibernate.
To limit the memory footprint, I decided not to keep the content of the complete file in memory, but to process each object individually using streaming.
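Roughly, the streaming part looks like this (a sketch only; Record, the "record" element name and the persist call are placeholders for my actual code):

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Unmarshaller;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.FileInputStream;
import java.io.InputStream;

XMLInputFactory xif = XMLInputFactory.newInstance();
try (InputStream in = new FileInputStream(file)) {
    XMLStreamReader reader = xif.createXMLStreamReader(in);
    Unmarshaller unmarshaller =
            JAXBContext.newInstance(Record.class).createUnmarshaller();
    while (reader.hasNext()) {
        if (reader.getEventType() == XMLStreamConstants.START_ELEMENT
                && "record".equals(reader.getLocalName())) {
            // unmarshal exactly one element; the reader moves past it
            Record record = unmarshaller.unmarshal(reader, Record.class).getValue();
            entityManager.persist(record);
        } else {
            reader.next();
        }
    }
}

So only one Record is materialized from the XML at a time.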
An additional requirement is that the folder is processed transactionally: if an error occurs in one of the XML files, the complete folder content is moved to an error folder and the elements that were already added to the database are rolled back.
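Simplified, the transactional part looks something like this (a sketch; the service, importFile and the folder names are placeholders, and the move to the error folder happens outside the transaction because file moves cannot be rolled back):

import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import org.springframework.transaction.annotation.Transactional;

public class FolderImportService {

    // one transaction for the whole folder; rollbackFor is needed because
    // Spring only rolls back on unchecked exceptions by default
    @Transactional(rollbackFor = Exception.class)
    public void importFolder(Path folder) throws Exception {
        try (DirectoryStream<Path> files = Files.newDirectoryStream(folder, "*.xml")) {
            for (Path file : files) {
                importFile(file); // streams and persists the records of one file
            }
        }
    }
}

// caller: by the time we get here the transaction has already rolled back
try {
    importService.importFolder(folder);
} catch (Exception ex) {
    Files.move(folder, errorFolder.resolve(folder.getFileName()),
            StandardCopyOption.REPLACE_EXISTING);
}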
Now my question is about Hibernate's memory management. Since the real commit happens at the very end (after all the elements have been persisted through the entity manager), does Hibernate keep the data to be committed in memory the whole time? And if so, is there any advantage to streaming the files in the folder, or is it useless because all the elements are held in memory anyway by the Spring transaction before committing to the DB?
Upvotes: 3
Views: 1152
Reputation: 43887
Hibernate keeps every entity you persist in its first-level cache (the persistence context) until the session is flushed and cleared, so a naive loop really would hold all the data in memory until the final commit. Once flushed, though, the uncommitted rows live in the database, not the JVM, so the rollback guarantee survives. If you want to stream this way in Hibernate, flush and clear the session as you go:
Transaction tx = session.beginTransaction();
try {
    while (...) {                   // one unmarshalled record per iteration
        processNextRecord(session); // persists a single record
        session.flush();            // push the pending insert to the database
        session.clear();            // evict it from the first-level cache
    }
    tx.commit();
} catch (Exception ex) {
    tx.rollback();                  // undoes everything persisted so far
}
For a detailed explanation of all of this and more, check out Hibernate's doc on batch processing.
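That doc also suggests flushing and clearing every N records rather than after every single one, with N matching the hibernate.jdbc.batch_size property so the pending inserts go out as one JDBC batch. A sketch of that variant ('records' stands for your streamed input):

int batchSize = 20; // match the hibernate.jdbc.batch_size property
int count = 0;
while (records.hasNext()) {
    session.persist(records.next());
    if (++count % batchSize == 0) {
        session.flush();  // execute the pending inserts as one JDBC batch
        session.clear();  // detach them from the first-level cache
    }
}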
Upvotes: 3