Reputation: 1589
Just wondering whether it's good practice to keep the chunk size the same for all jobs in a Spring Batch application, or whether it should differ from job to job depending on each job's behaviour.
I understand that the answer obviously depends on a lot of factors, but I just wanted to know what the standard approach is, if any.
Thanks
Upvotes: 0
Views: 2483
Reputation: 96
I think understanding the factors that go into choosing a chunk size will help answer this question. If those factors happen to be the same for all your jobs (though this is very rare), then a shared chunk size is fine.
Some books, such as "Spring Batch in Action", recommend keeping the chunk size roughly between 20 and 200. Following that book's advice, the idea is to run each job against different kinds of sample data with different chunk sizes, compare the results for yourself, and then choose the chunk size accordingly.
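To make that kind of benchmarking easy, the chunk size can be externalized as a property instead of being hard-coded, so you can rerun the same job with different values. A minimal sketch using the Spring Batch 5 `StepBuilder` API; the property name, bean names, and `Record` type are illustrative assumptions, not from the answer:

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class ImportJobConfig {

    @Bean
    public Step importStep(JobRepository jobRepository,
                           PlatformTransactionManager transactionManager,
                           ItemReader<Record> reader,
                           ItemWriter<Record> writer,
                           // hypothetical property; defaults to 100 if unset
                           @Value("${import.chunk-size:100}") int chunkSize) {
        return new StepBuilder("importStep", jobRepository)
                .<Record, Record>chunk(chunkSize, transactionManager)
                .reader(reader)
                .writer(writer)
                .build();
    }
}
```

You could then run the job with, say, `--import.chunk-size=20`, `50`, `200` against representative sample data and compare throughput.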
Upvotes: 2
Reputation: 21453
Chunk size is very specific to the job at hand. It's a key lever for tuning the performance of a batch job and will probably differ for each job you write. For example, if you have small records, you may be able to fit more of them in a chunk, which reduces the number of writes; whereas if you have large records, you may not be able to hold as many in memory between writes.
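The memory trade-off above can be sketched as a back-of-envelope calculation: for a fixed memory budget for the items held between writes, the record size caps the chunk size. The budget and record sizes below are illustrative assumptions, not figures from the answer:

```java
// Rough sizing sketch: chunk size is bounded by how many records fit
// in the memory you are willing to hold between writes.
public class ChunkSizing {

    // Largest chunk that fits in the given memory budget.
    static int maxChunkSize(long memoryBudgetBytes, long recordSizeBytes) {
        return (int) (memoryBudgetBytes / recordSizeBytes);
    }

    public static void main(String[] args) {
        long budget = 16L * 1024 * 1024; // assume a 16 MB budget
        // Small 1 KB records allow thousands of items per chunk...
        System.out.println(maxChunkSize(budget, 1024));       // 16384
        // ...while large 512 KB records allow only a handful.
        System.out.println(maxChunkSize(budget, 512 * 1024)); // 32
    }
}
```

In practice the budget interacts with transaction length and restartability, so the calculated ceiling is only a starting point for the benchmarking described in the other answer.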
Upvotes: 3