xiaodai

Reputation: 16004

Error when inserting large CSV into empty tables via Teradata SQL assistant

I have a 6 gig csv and I am trying to load it into Teradata.

So I fire up Teradata SQL Assistant, create an empty table, turn on Import Data mode, and try to insert the records into the empty table using

insert into some_lib.some_table
values (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?);

But it always fails around the 600k-row mark with the message

Error reading import file at record 614770: Exception of type 'System.OutOfMemoryException' was thrown.

I think it's because Teradata SQL Assistant is trying to load everything into memory on my 4 GB laptop before sending the data to the Teradata server. Is my theory correct? How do I tell Teradata SQL Assistant to upload the data in chunks instead of holding everything in local memory?

Upvotes: 0

Views: 2283

Answers (1)

Rob Paller

Reputation: 7786

I believe you are pushing SQL Assistant beyond its capabilities as a data-loading tool.

Have you considered installing the Teradata load utilities, such as FastLoad or MultiLoad, on your system?
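For a CSV going into an empty table, FastLoad is the natural fit (it only loads empty tables and streams the file in checkpointed chunks rather than buffering it in memory). A minimal script sketch, assuming a hypothetical two-column table and file path (your real table has 57 columns, so you would list all of them in DEFINE and in the INSERT):

LOGON tdpid/your_user,your_password;

BEGIN LOADING some_lib.some_table
   ERRORFILES some_lib.some_table_err1, some_lib.some_table_err2
   CHECKPOINT 100000;          /* commit progress every 100k rows */

SET RECORD VARTEXT ",";        /* comma-delimited input */

DEFINE col1 (VARCHAR(100)),
       col2 (VARCHAR(100))
FILE = /path/to/your.csv;

INSERT INTO some_lib.some_table VALUES (:col1, :col2);

END LOADING;
LOGOFF;

You would run this with the fastload command-line utility; the CHECKPOINT clause is what keeps memory use bounded regardless of file size.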

Another option, if you don't want to write scripts for the load utilities, is to install Teradata Studio Express, which provides a mechanism to load your data using JDBC FastLoad: the Smart Loader feature of Studio Express. You may find this scales better than SQL Assistant, which relies on .NET and ODBC.

Upvotes: 1
