vlyalcin

Reputation: 561

What is the max JDBC batch size?

I have a list that grows continuously, and I call addBatch() for each entry. I forgot to put a limit on the batch size before calling executeBatch().

The program has been running for hours, and I don't want to stop it, fix the code, and restart it for now.

My questions: what decides the size of a batch? What is the maximum number of statements executeBatch() can run at one time? How many times can I call addBatch() without calling executeBatch()?

Upvotes: 16

Views: 40790

Answers (3)

Craig Ringer

Reputation: 324911

PgJDBC has some limitations regarding batches.

The benefit of batching is a reduction in network round trips, so there's much less point if your DB is local to your app server. There are diminishing returns with increasing batch size: the total time spent in network waits falls off quickly, so it's often not worth stressing about trying to make batches as big as possible.

If you're bulk-loading data, seriously consider using the COPY API instead, via PgJDBC's CopyManager, obtained via the PgConnection interface. It lets you stream CSV-like data to the server for rapid bulk-loading with very few client/server round trips. Unfortunately, it's remarkably under-documented - it doesn't appear in the main PgJDBC docs at all, only in the API docs.
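A minimal sketch of that approach (the connection URL, credentials, table, and column names are placeholders for illustration; the PostgreSQL JDBC driver must be on the classpath):

```java
import java.io.StringReader;
import java.sql.Connection;
import java.sql.DriverManager;

import org.postgresql.PGConnection;
import org.postgresql.copy.CopyManager;

public class CopyExample {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/mydb", "user", "secret")) {
            // Unwrap the PgJDBC-specific interface to reach the COPY API.
            CopyManager copyManager = conn.unwrap(PGConnection.class).getCopyAPI();

            // Stream CSV rows straight to the server in a single operation,
            // instead of one INSERT per row.
            long rowsCopied = copyManager.copyIn(
                    "COPY my_table (id, name) FROM STDIN WITH (FORMAT csv)",
                    new StringReader("1,alice\n2,bob\n"));
            System.out.println(rowsCopied + " rows copied");
        }
    }
}
```

Because the whole data stream travels over one COPY operation, the per-statement round-trip cost disappears entirely, which is why this usually beats even large batches for bulk loads.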

Upvotes: 13

Martin

Reputation: 2552

There may be a maximum number of parameter markers depending on the JDBC implementation.

For instance, the PostgreSQL driver encodes the number of parameters as a signed 2-byte integer, whose maximum value is 32767.
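That cap matters when many placeholders end up in a single statement, for example when the driver rewrites a batch into one multi-row INSERT (the reWriteBatchedInserts connection property). A quick back-of-the-envelope sketch of the resulting row limit, assuming the signed 16-bit cap:

```java
public class BatchLimit {
    // Signed 16-bit maximum, matching the 2-byte parameter count on the wire.
    static final int MAX_PARAMETERS = Short.MAX_VALUE; // 32767

    // Maximum rows one rewritten multi-row INSERT can hold when each
    // row binds `parametersPerRow` placeholders.
    static int maxRowsPerBatch(int parametersPerRow) {
        return MAX_PARAMETERS / parametersPerRow;
    }

    public static void main(String[] args) {
        // e.g. an INSERT with 10 columns per row
        System.out.println(maxRowsPerBatch(10)); // prints 3276
    }
}
```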

Upvotes: 2

asafm

Reputation: 919

AFAIK there is no limit besides memory. Regarding your question: the statements are sent to the DB only on executeBatch(), so until you execute the batch, memory will keep growing until either you hit a Java heap space error or the batch is finally sent to the DB.
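The usual way to keep that growth bounded is to flush the batch every N statements. A sketch of the pattern, assuming a hypothetical my_table and an arbitrarily chosen limit of 1000:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchInserter {
    static final int BATCH_LIMIT = 1000; // chosen arbitrarily for illustration

    // True whenever `count` statements have accumulated and should be flushed.
    static boolean shouldFlush(int count) {
        return count > 0 && count % BATCH_LIMIT == 0;
    }

    // Adds every value to the batch, flushing periodically so the
    // client-side batch never grows without bound.
    static void insertAll(Connection conn, List<String> values) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO my_table (name) VALUES (?)")) {
            int count = 0;
            for (String value : values) {
                ps.setString(1, value);
                ps.addBatch();
                if (shouldFlush(++count)) {
                    ps.executeBatch(); // send accumulated statements, freeing memory
                }
            }
            ps.executeBatch(); // flush any remainder
        }
    }
}
```

The trailing executeBatch() handles the leftover statements when the list size is not a multiple of the limit.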

Upvotes: 2
