Reputation: 34424
As stated in the flush() Javadoc:
Flushing is the process of synchronizing the underlying persistent store with persistable state held in memory.
Here is my understanding of the above statement:
If somebody does an insert/update and then flushes, the extra inserted rows will still be lying in Java memory only. The other way around, i.e. the DB data will only be synchronized with the persistable state held in memory on commit.
Now let's go by the above understanding.
I came across Hibernate commit() and flush(), where the accepted answer says session.flush() helps release memory in some cases and hence avoids an OutOfMemoryException.
When I run the code below, line1 (session.flush()) will execute the insert queries for 20 customers on the customer table, which releases the memory for the 20 Customer objects in the list, but on the other hand it creates 20 customer rows under the customer table that are still in Java memory (they will go to the database only on commit at line2). So I am not sure how session.flush() helps here in releasing memory?
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
for ( int i=0; i<100000; i++ ) {
    Customer customer = new Customer(.....);
    session.save(customer);
    if ( i % 20 == 0 ) { // 20, same as the JDBC batch size
        // flush a batch of inserts and release memory:
        session.flush(); // line1
        session.clear();
    }
}
tx.commit(); // line2
session.close();
Upvotes: 3
Views: 3854
Reputation: 21445
As per the API for session.flush():
void flush()
throws HibernateException
Force this session to flush. Must be called at the end of a unit of work, before committing the transaction and closing the session (depending on flush-mode, Transaction.commit() calls this method).
Flushing is the process of synchronizing the underlying persistent store with persistable state held in memory.
So the purpose of this method is to synchronize your persistent state with the underlying database. It won't help in freeing up memory if you still have references to the objects.
Also please refer to this link - Hibernate out of memory exception while processing large collection of elements
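As a rough illustration of that point (assuming the Customer entity and sessionFactory from the question, and a no-arg constructor), keeping your own strong references defeats flush() and clear(): the rows are written out, but the objects stay reachable and cannot be garbage collected.
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
List<Customer> allCustomers = new ArrayList<>(); // application-level references
for ( int i=0; i<100000; i++ ) {
    Customer customer = new Customer();
    allCustomers.add(customer); // our own reference, independent of Hibernate
    session.save(customer);
    if ( i % 20 == 0 ) {
        session.flush(); // INSERTs are sent to the JDBC connection
        session.clear(); // first-level cache is emptied ...
        // ... but every Customer is still reachable via allCustomers,
        // so none of them can be garbage collected.
    }
}
tx.commit();
session.close();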
Whereas in your program you are creating 100000 Customer objects inside the for-loop. If there were no Hibernate code, you would not be keeping any references to these objects, so they would be eligible for garbage collection.
But when you call session.save(customer), the objects become associated with Hibernate's Session and are placed in the first-level cache. If the number of objects grows and sufficient memory is not available, you will get out-of-memory issues, because Hibernate tries to keep all of these objects in memory. Calling flush() makes Hibernate issue the required queries to the database, and calling clear() releases the objects held in the first-level cache, thus freeing up some memory.
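As a side note, the comment in the question's code ("20, same as the JDBC batch size") refers to Hibernate's hibernate.jdbc.batch_size setting; a flush of 20 inserts only becomes a single JDBC batch when that property is set. A minimal sketch, assuming the SessionFactory is built from a programmatic Configuration:
Configuration configuration = new Configuration().configure();
// Assumption: the batch size matches the flush interval used in the loop,
// so each flush() of 20 inserts goes out as one JDBC batch.
configuration.setProperty("hibernate.jdbc.batch_size", "20");
SessionFactory sessionFactory = configuration.buildSessionFactory();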
Update:
Calling session.clear() makes Hibernate clear the memory it maintains internally in the form of the first-level cache. session.clear() delegates to StatefulPersistenceContext.clear(), which calls clear() on its various Maps and other internal structures. This empties the first-level cache, so the objects become eligible for garbage collection and some memory can be freed. Hibernate no longer maintains their state, which means the objects are now in the detached state.
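A small sketch of that detachment, using Session.contains() to check whether an entity is still managed (Customer and sessionFactory are again assumed from the question):
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
Customer customer = new Customer();
session.save(customer);
session.flush();                                // the INSERT is sent to the database
System.out.println(session.contains(customer)); // true: still in the first-level cache
session.clear();                                // persistence context is emptied
System.out.println(session.contains(customer)); // false: customer is now detached
tx.commit();
session.close();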
Now, as per the API for Transaction.commit():
void commit()
throws HibernateException
Flush the associated Session and end the unit of work (unless we are in FlushMode.MANUAL).
This method will commit the underlying transaction if and only if the underlying transaction was initiated by this object.
Calling commit() flushes any pending items and then issues a commit to the underlying database.
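In other words, with the default FlushMode.AUTO you do not need an explicit flush() for the data to reach the database; a rough sketch (same assumed Customer and sessionFactory):
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
Customer customer = new Customer();
session.save(customer); // no SQL has necessarily been executed yet
tx.commit();            // flushes the pending INSERT, then commits the transaction
session.close();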
Update:
Also refer to this link - Flushing the session, where it clearly says that JDBC calls are made when we call the flush() method.
Sometimes the Session will execute the SQL statements needed to synchronize the JDBC connection's state with the state of objects held in memory. This process is called flushing.
Except when you explicitly flush(), there are absolutely no guarantees about when the Session executes the JDBC calls, only the order in which they are executed.
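For example, with FlushMode.MANUAL (mentioned in the commit() Javadoc above) the Session never flushes on its own, so the only JDBC calls are the ones you trigger explicitly. A minimal sketch, assuming the classic Session.setFlushMode(FlushMode) API:
Session session = sessionFactory.openSession();
session.setFlushMode(FlushMode.MANUAL); // never flush automatically
Transaction tx = session.beginTransaction();
Customer customer = new Customer();
session.save(customer); // queued in the persistence context, no SQL yet
session.flush();        // the INSERT is executed here, at a point we chose
tx.commit();            // commits the transaction; nothing is auto-flushed
session.close();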
Upvotes: 4
Reputation: 3043
flush() will send the data to the database. You will see that if you activate SQL logging in Hibernate. commit() will commit the changes previously sent to the database. It might be that your cache keeps all or some of that data; in that case you have to tweak your cache setup.
That said, if you have so many changed or new objects in memory that you need to flush just to avoid an OutOfMemoryError, chances are you need to change your design at a deeper level, for example by making multiple but smaller transactions. Transactions this big can cause problems both in your application and in the database.
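A rough sketch of what "multiple but smaller transactions" could look like for the question's import loop (the chunk size of 1000 and the Customer entity are assumptions):
Session session = sessionFactory.openSession();
for ( int chunk = 0; chunk < 100000; chunk += 1000 ) {
    Transaction tx = session.beginTransaction();
    for ( int i = chunk; i < chunk + 1000; i++ ) {
        session.save(new Customer());
    }
    tx.commit();     // each chunk is flushed and committed on its own
    session.clear(); // drop the chunk from the first-level cache
}
session.close();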
Upvotes: 0