Reputation: 47709
I'm running a process that performs a large number of updates (> 100,000) against a table. I can either put all the updates in a single transaction or commit a transaction every 1,000 updates or so.
Ignore for the moment the case where a transaction fails and is aborted. I'm interested in which transaction size is best for memory use and speed.
Upvotes: 4
Views: 1002
Reputation: 300749
Ignoring the case of a transaction failing, splitting the work into batches will use less memory, because the database never has to hold the undo/rollback state for all 100,000+ updates at once.
It is conceivable that batching adds some overhead to the total time taken to perform the entire update, but it puts less pressure on anything else running concurrently, since each commit releases the locks and log space held by that batch.
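For concreteness, here is a minimal sketch of the batched approach using Python's built-in sqlite3 module; the `items` table, its columns, and the commit-every-1,000 cadence are illustrative assumptions, and the same commit-every-N pattern carries over to any DB-API driver.

```python
import sqlite3

BATCH_SIZE = 1000  # assumed batch size; tune for your workload


def apply_updates(conn: sqlite3.Connection, updates):
    """Apply a large stream of (new_value, row_id) updates, committing
    every BATCH_SIZE rows so the database never holds the whole run
    in a single open transaction."""
    cur = conn.cursor()
    for i, (new_value, row_id) in enumerate(updates, start=1):
        # Hypothetical table/columns, for illustration only.
        cur.execute("UPDATE items SET value = ? WHERE id = ?",
                    (new_value, row_id))
        if i % BATCH_SIZE == 0:
            conn.commit()  # release locks/undo space held by this batch
    conn.commit()  # flush the final partial batch
```

Committing every N rows bounds how much lock and rollback state the database must hold at any moment, while still amortizing the per-commit cost over many updates rather than paying it on every row.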
Upvotes: 2