Reputation: 679
I want to update a column value in a table containing 800k rows, so I have created a simple Java application with Hibernate 4.3.6.Final as the ORM framework.
I have configured the JDBC batch size to 45 and disabled the second-level cache:
<property name="hibernate.jdbc.batch_size">45</property>
<property name="hibernate.cache.use_second_level_cache">false</property>
Can I increase the batch_size, for example to 200? I ask because the Hibernate docs mention:
Before batch processing, enable JDBC batching. To enable JDBC batching, set the property hibernate.jdbc.batch_size to an integer between 10 and 50.
This is the code (simplified):
session.beginTransaction();
List<MyEntity> entities = findAllEntities();
logger.info("Number of fetched rows: " + entities.size());
int count = 0;
for (MyEntity entity : entities) {
    // change some fields of the entity
    session.update(entity);
    if (++count % HIBERNATE_BATCH_SIZE == 0) {
        // flush a batch of updates and release memory:
        session.flush();
        session.clear();
    }
}
session.getTransaction().commit();
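If the per-entity update logic really is needed (so a single bulk statement won't do), the memory pressure of materializing all 800k rows into a List can be avoided by iterating a scrollable cursor instead. A minimal sketch, assuming the same session, MyEntity, and batch-size constant as above:

```java
// Sketch: stream rows with a forward-only cursor instead of loading them all at once.
// Assumes the same open session, MyEntity mapping, and HIBERNATE_BATCH_SIZE constant.
session.beginTransaction();
ScrollableResults results = session.createQuery("FROM MyEntity")
        .scroll(ScrollMode.FORWARD_ONLY);
int count = 0;
while (results.next()) {
    MyEntity entity = (MyEntity) results.get(0);
    // change some fields of the entity
    session.update(entity);
    if (++count % HIBERNATE_BATCH_SIZE == 0) {
        session.flush();  // push a batch of UPDATEs to the database
        session.clear();  // detach processed entities to free memory
    }
}
results.close();
session.getTransaction().commit();
```

This keeps only one batch's worth of entities in the persistence context at a time, while the JDBC batch_size setting still groups the UPDATE statements.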
Upvotes: 0
Views: 616
Reputation: 5563
Fetching all the entities and then looping through them one by one is quite tedious and will always result in suboptimal performance.
Since it seems that you perform an unconditional update (i.e. no checks are in place to decide which objects get their fields updated), you should use a simple HQL
query to perform the update in a single statement.
For example, given that the entity you want to update is MyEntity,
your query would look like this:
int rows = session.createQuery(
        "UPDATE MyEntity me SET me.myField1 = :newField1, me.myField2 = :newField2")
    .setString("newField1", "Something")
    .setString("newField2", "Something")
    .executeUpdate();
Using that should improve performance a lot.
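One caveat worth a sketch: an HQL bulk update goes straight to the database and bypasses the persistence context, so any MyEntity instances already loaded in the session keep their stale in-memory values. Clearing the session afterwards (or refreshing the affected entities) avoids that, and the statement still needs an explicit transaction. A sketch using the same field and parameter names as above:

```java
// Sketch: the bulk update wrapped in a transaction; the session is cleared
// afterwards because bulk HQL bypasses the first-level cache.
session.beginTransaction();
int rows = session.createQuery(
        "UPDATE MyEntity me SET me.myField1 = :newField1, me.myField2 = :newField2")
    .setString("newField1", "Something")
    .setString("newField2", "Something")
    .executeUpdate();
session.getTransaction().commit();
session.clear();  // drop stale managed instances after the bulk update
logger.info("Updated rows: " + rows);
```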
Upvotes: 2