Reputation: 355
I am using JDBC to upload data to Teradata. I used to batch 100,000 rows at a time, and it always worked fine; no dataset ever failed to upload.
Now, when I try to upload a single-column table (all integers), I get the error "Too many data records packed in one USING row". When I reduced the batch size to 16,383, it worked.
I found that I can still use a 100,000-row batch for tables with multiple columns, but when I try to upload a single-column table, it throws "Too many data records packed in one USING row". I just can't understand why. Intuitively, a single-column table should be easier to upload, right? What is going on here?
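One common workaround is to split the rows into chunks that stay under the 16,383-row cap and call `executeBatch()` once per chunk. Below is a minimal sketch of the chunking logic (the `MAX_BATCH` constant and the `batchSizes` helper are my own names, not part of any Teradata API; the commented lines show where the real `PreparedStatement` calls would go):

```java
import java.util.ArrayList;
import java.util.List;

public class BatchSplit {
    // Cap observed for non-FastLoad PreparedStatement batches in the
    // Teradata JDBC driver, per the error in the question.
    static final int MAX_BATCH = 16383;

    // Split a total row count into batch sizes that each respect MAX_BATCH.
    static List<Integer> batchSizes(int totalRows) {
        List<Integer> sizes = new ArrayList<>();
        int remaining = totalRows;
        while (remaining > 0) {
            int n = Math.min(remaining, MAX_BATCH);
            sizes.add(n);
            remaining -= n;
        }
        return sizes;
    }

    public static void main(String[] args) {
        // In real code: for each row, ps.setInt(1, value); ps.addBatch();
        // then ps.executeBatch() once per chunk computed here.
        List<Integer> sizes = batchSizes(100_000);
        System.out.println(sizes.size());                 // → 7 executeBatch() calls
        System.out.println(sizes.get(sizes.size() - 1));  // → final batch of 1702 rows
    }
}
```

So a 100,000-row upload becomes six full batches of 16,383 rows plus one final batch of 1,702.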
Upvotes: 0
Views: 1399
Reputation: 7786
16,383 rows is the batch limit for a PreparedStatement using a non-FastLoad INSERT with the Teradata JDBC driver.
Have you considered adding TYPE=FASTLOAD to your connection parameters, so the driver can invoke the FastLoad API to bulk-load your data for INSERT statements that FastLoad supports? JDBC FastLoad is recommended for inserts of 100K rows or more. The big caveat is that your target table in Teradata must be empty.
If it isn't empty, you can instead load the data into an empty staging table and then use the ANSI MERGE operator to perform an UPSERT of the staged data into the target table.
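As a rough sketch, the connection URL would look something like this (host and database names are placeholders):

```
jdbc:teradata://dbhost/DATABASE=mydb,TYPE=FASTLOAD
```

With this in place, simple batched INSERTs into an empty table are routed through FastLoad transparently; statements FastLoad cannot handle fall back to the regular path.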
Upvotes: 2