Reputation: 5542
The official documentation says:
Batch writing can improve database performance by sending groups of INSERT, UPDATE, and DELETE statements to the database in a single transaction, rather than individually
(emphasis is mine).
But if an entity has an optimistic locking
@Version field, then all UPDATE statements are executed individually rather than batched.
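For context, here is a minimal sketch of such an entity (the Account class and its fields are just example names):

import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Version;

@Entity
public class Account {

    @Id
    private Long id;

    private String name;

    // Optimistic locking: the provider appends "AND VERSION = ?" to each
    // UPDATE and increments the column, so concurrent changes are detected.
    @Version
    private Long version;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}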
To prove this, here is a source code snippet from DatabaseAccessor (line 566):
if (/* omitted */ (!dbCall.hasOptimisticLock() || getPlatform().canBatchWriteWithOptimisticLocking(dbCall)) /* omitted */) {
    // this will handle executing batched statements, or switching mechanisms if required
    getActiveBatchWritingMechanism().appendCall(session, dbCall);
    // bug 4241441: passing 1 back to avoid optimistic lock exceptions since there
    // is no way to know if it succeeded on the DB at this point.
    return Integer.valueOf(1);
}
So, basically, the above snippet means that if an entity has an optimistic lock (and the platform does not support batch writing with optimistic locking), the batch update mechanism is skipped.
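In a hypothetical bulk update like the one below (reusing the Account entity sketched above; the persistence unit name my-pu is a placeholder), every UPDATE would then go to the database as a separate round trip:

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class BulkUpdateExample {
    public static void main(String[] args) {
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("my-pu");
        EntityManager em = emf.createEntityManager();

        em.getTransaction().begin();
        for (Account account : em.createQuery("SELECT a FROM Account a", Account.class)
                                 .getResultList()) {
            account.setName(account.getName().trim());
        }
        // On commit, the provider flushes one UPDATE per modified entity;
        // without batching, each one is executed as its own statement.
        em.getTransaction().commit();

        em.close();
        emf.close();
    }
}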
Is there a workaround for that? I still want to use JPA.
UPDATE:
It turned out that I needed to add this property to persistence.xml
in order to enable batch updates with optimistic locking:
<property name="eclipselink.target-database" value="org.eclipse.persistence.platform.database.oracle.Oracle11Platform"/>
Note that Oracle10Platform or higher can be used as the value; lower platform versions don't support this feature.
Also, to enable batch writing in the first place, you have to add this property to your persistence.xml:
<property name="eclipselink.jdbc.batch-writing" value="JDBC" />
You can also, optionally, configure the batch size:
<property name="eclipselink.jdbc.batch-writing.size" value="1000" />
Upvotes: 1
Views: 1850
Reputation: 21145
Did you check the canBatchWriteWithOptimisticLocking() method for the platform class you are using? This check exists so that batching can still be used when your driver supports returning the row counts of the individual statements within a batch, which is what allows EclipseLink to throw an optimistic lock exception as required.
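If your platform class returns false there but you know your JDBC driver does report per-statement row counts from executeBatch(), one possible workaround (a sketch, not something I have tested) is to subclass the platform and override the check:

import org.eclipse.persistence.internal.databaseaccess.DatabaseCall;
import org.eclipse.persistence.platform.database.MySQLPlatform;

// Hypothetical subclass; only safe if the driver returns real row counts
// for each statement in a batch (not Statement.SUCCESS_NO_INFO), since
// EclipseLink needs them to detect optimistic lock failures.
public class RowCountAwareMySQLPlatform extends MySQLPlatform {

    @Override
    public boolean canBatchWriteWithOptimisticLocking(DatabaseCall call) {
        return true;
    }
}

You would then reference the fully qualified name of the subclass as the value of the eclipselink.target-database property.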
Upvotes: 1