Reputation: 3841
I am wondering whether it a) makes sense and b) is possible to use different JDBC batch sizes for inserting different entities?
Example:
Entity A has millions of rows, but only a few small columns. For this table I could set the batch size to about 100 or even 1000.
Entity B contains a column with some thousand bytes of data which are filled on insert. For this table I think I should only use a batch size between 1 and 10.
Maybe I can use my own implementation of org.hibernate.engine.jdbc.batch.spi.BatchBuilder, where I set the batch size depending on the entity to be inserted? (By setting the Hibernate property hibernate.jdbc.batch.builder.)
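Registration could then look like this minimal sketch (com.example.PerEntityBatchBuilder is just a placeholder name for my implementation; connection and dialect settings omitted):
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class BatchBuilderRegistration {
    public static void main(String[] args) {
        SessionFactory sessionFactory = new Configuration()
                // placeholder class name for the custom BatchBuilder
                .setProperty( "hibernate.jdbc.batch.builder", "com.example.PerEntityBatchBuilder" )
                // global default for entities without a specific batch size
                .setProperty( "hibernate.jdbc.batch_size", "50" )
                .buildSessionFactory();
    }
}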
Does that sound logical?
Upvotes: 2
Views: 875
Reputation: 9102
Your approach seems to be correct. The default implementation of BatchBuilder
is BatchBuilderImpl
, and one of the parameters passed to its buildBatch
method is a BatchKey
. This has the format (fully qualified entity class name)#(type of operation),
e.g. com.test.A#INSERT.
public class BatchBuilderImpl implements BatchBuilder, Configurable, Manageable, BatchBuilderMXBean {
    // .......

    @Override
    public Batch buildBatch(BatchKey key, JdbcCoordinator jdbcCoordinator) {
        // A session-level batch size (Session#setJdbcBatchSize) wins over
        // the globally configured one.
        final Integer sessionJdbcBatchSize = jdbcCoordinator.getJdbcSessionOwner()
                .getJdbcBatchSize();
        final int jdbcBatchSizeToUse = sessionJdbcBatchSize == null
                ? this.jdbcBatchSize
                : sessionJdbcBatchSize;
        // Batching is only worthwhile for sizes greater than 1.
        return jdbcBatchSizeToUse > 1
                ? new BatchingBatch( key, jdbcCoordinator, jdbcBatchSizeToUse )
                : new NonBatchingBatch( key, jdbcCoordinator );
    }
}
Within your customized implementation you can set the batch size by loading a configuration from an external file that maps entity classes to their batch sizes. The batch size to use would then be determined by the first portion of the BatchKey.
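A minimal sketch of such a builder (the class name and the hard-coded map, which stands in for your external file, are assumptions; the string matching relies on the key format described above):
import java.util.HashMap;
import java.util.Map;

import org.hibernate.engine.jdbc.batch.internal.BatchBuilderImpl;
import org.hibernate.engine.jdbc.batch.internal.BatchingBatch;
import org.hibernate.engine.jdbc.batch.internal.NonBatchingBatch;
import org.hibernate.engine.jdbc.batch.spi.Batch;
import org.hibernate.engine.jdbc.batch.spi.BatchKey;
import org.hibernate.engine.jdbc.spi.JdbcCoordinator;

public class PerEntityBatchBuilder extends BatchBuilderImpl {

    // Hard-coded here for brevity; load from your external file instead.
    private static final Map<String, Integer> BATCH_SIZES = new HashMap<>();
    static {
        BATCH_SIZES.put( "com.test.A", 1000 );
        BATCH_SIZES.put( "com.test.B", 5 );
    }

    @Override
    public Batch buildBatch(BatchKey key, JdbcCoordinator jdbcCoordinator) {
        // As described above, the key identifies "<entity class>#<operation>",
        // e.g. "com.test.A#INSERT"; match on the entity-class portion.
        final String keyText = key.toString();
        for ( Map.Entry<String, Integer> entry : BATCH_SIZES.entrySet() ) {
            if ( keyText.contains( entry.getKey() + "#" ) ) {
                final int batchSize = entry.getValue();
                return batchSize > 1
                        ? new BatchingBatch( key, jdbcCoordinator, batchSize )
                        : new NonBatchingBatch( key, jdbcCoordinator );
            }
        }
        // No entity-specific size configured: keep the default behavior
        // (hibernate.jdbc.batch_size / session-level override).
        return super.buildBatch( key, jdbcCoordinator );
    }
}
Register it via hibernate.jdbc.batch.builder as you suggested; since the class extends BatchBuilderImpl, the global hibernate.jdbc.batch_size should still provide the fallback size for all other entities.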
Upvotes: 2