Reputation: 139
Daily, I currently have to:
There are approx 3000 rows of data
I have, with the help of stackoverflow and my trial and error, created VBA which:
It works, but it's painfully slow because it loops through roughly 3,000 rows. Because the Teradata database is accessed by a large number of people at all times, every time an error occurs my insert statements go to the back of the queue, adding to the time taken to complete the procedure.
Is it possible to create VBA that will batch-load the 50 files, rather than my current method of creating 3,000 insert statements and looping? I have read about a process that imports into Teradata using the 'Jet Provider', but I know very little about it and my efforts so far have failed.
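For what it's worth, one way to cut down the round trips without any extra tooling is to pack many INSERTs into a single multi-statement request, which Teradata accepts through ODBC. This is only an untested sketch: it assumes an open ADODB connection (`conn`) through the Teradata ODBC driver and a hypothetical target table `MY_DB.MY_TABLE` with a text column and a numeric column.

```vba
' Sketch: send rows in batches of 500 as one multi-statement request each,
' instead of one round trip per row. Table/column names are placeholders.
Sub BatchInsert(conn As Object, ws As Worksheet, lastRow As Long)
    Const BATCH_SIZE As Long = 500
    Dim sql As String, r As Long, inBatch As Long

    sql = "": inBatch = 0
    For r = 2 To lastRow
        ' Teradata runs semicolon-separated statements as one request
        sql = sql & "INSERT INTO MY_DB.MY_TABLE VALUES ('" & _
              Replace(ws.Cells(r, 1).Value, "'", "''") & "', " & _
              ws.Cells(r, 2).Value & ");"
        inBatch = inBatch + 1
        If inBatch = BATCH_SIZE Then
            conn.Execute sql, , 128  ' 128 = adExecuteNoRecords
            sql = "": inBatch = 0
        End If
    Next r
    If inBatch > 0 Then conn.Execute sql, , 128  ' flush the last partial batch
End Sub
```

Building SQL strings like this still means escaping values by hand (note the `Replace` for quotes), so bind variables, as suggested below, are the cleaner option if your driver supports them.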
Does anyone know about this method, or any other quicker method? I want to avoid BTEQ and MLOAD, and I can't use FLOAD (because the tables I load into are not empty). Thanks everyone.
Upvotes: 0
Views: 2135
Reputation: 4262
Teradata is easily capable of handling massive inserts, no matter how many users. However, it is very bad at small inserts. You need to bundle them into large volumes that are loaded in one go; there is tooling for that delivered with Teradata (MLOAD). Then you get top performance.
As an alternative, possibly less speedy, make sure that you are using bind variables rather than explicit values in your SQL, since parsing can take a lot of time. You might also want to consider tools such as Kettle (an ETL tool) or Invantive Control (an Excel add-in; disclosure: I work there) to load from Excel into Teradata. They can also load in parallel, improving performance.
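In VBA the bind-variable approach would look roughly like the sketch below, using an ADODB prepared command. This is untested and the table, column names, and parameter sizes are assumptions; the point is that the `?` placeholders let Teradata parse the statement once and reuse the plan for every row.

```vba
' Sketch: prepared INSERT with bind variables via late-bound ADODB.
' MY_DB.MY_TABLE, col1 (varchar 100) and col2 (integer) are hypothetical.
Sub InsertWithBindVariables(conn As Object, ws As Worksheet, lastRow As Long)
    Dim cmd As Object, r As Long
    Set cmd = CreateObject("ADODB.Command")
    Set cmd.ActiveConnection = conn
    cmd.CommandText = "INSERT INTO MY_DB.MY_TABLE (col1, col2) VALUES (?, ?)"
    cmd.Prepared = True
    cmd.Parameters.Append cmd.CreateParameter("p1", 200, 1, 100) ' 200=adVarChar, 1=adParamInput
    cmd.Parameters.Append cmd.CreateParameter("p2", 3, 1)        ' 3=adInteger

    For r = 2 To lastRow
        cmd.Parameters(0).Value = ws.Cells(r, 1).Value
        cmd.Parameters(1).Value = ws.Cells(r, 2).Value
        cmd.Execute , , 128  ' 128 = adExecuteNoRecords
    Next r
End Sub
```

This still executes one statement per row, but it removes the per-row parse cost and the need to escape values by hand.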
Upvotes: 1