user1016594

Reputation: 71

Fastest way to insert a million rows in Oracle

How can I insert more than a million rows in Oracle in an optimal way with the following procedure? It hangs if I increase the FOR loop to a million rows.

create or replace procedure inst_prc1 as
   xssn number;
   xcount number;
   l_start Number;
   l_end Number;
   cursor c1 is select max(ssn)S1 from dtr_debtors1;

Begin
  l_start := DBMS_UTILITY.GET_TIME;
  FOR I IN 1..10000 LOOP
    For C1_REC IN C1 Loop
      insert into dtr_debtors1(SSN) values (C1_REC.S1+1);
    End loop;
  END LOOP;
  commit;
  l_end := DBMS_UTILITY.GET_TIME;
  DBMS_OUTPUT.PUT_LINE('The Procedure  Start Time is '||l_start);
  DBMS_OUTPUT.PUT_LINE('The Procedure End Time is '||l_end); 
End inst_prc1;

Upvotes: 5

Views: 39423

Answers (4)

Nvr

Reputation: 171

1) If you want to insert using PL/SQL, then use BULK COLLECT INTO, and for the insert DML use FORALL (a bulk bind); see the sketch after this list.

2) For a multi-row insert in plain SQL, use the INSERT ALL statement.

3) Another method is INSERT INTO <tb_nm> ... SELECT.

4) Use the SQL*Loader utility.
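A minimal sketch of option 1, assuming the new SSN values are simply generated as max(ssn)+1, max(ssn)+2, ... (the row count and the way the values are generated are illustrative, not part of the original answer):

DECLARE
  TYPE t_ssn_tab IS TABLE OF dtr_debtors1.ssn%TYPE;
  l_ssns t_ssn_tab;
  l_max  dtr_debtors1.ssn%TYPE;
BEGIN
  SELECT NVL(MAX(ssn), 0) INTO l_max FROM dtr_debtors1;

  -- BULK COLLECT INTO: build all the new values in one round trip.
  SELECT l_max + LEVEL
    BULK COLLECT INTO l_ssns
    FROM dual
  CONNECT BY LEVEL <= 1000000;

  -- FORALL: one bulk-bound insert instead of a million single-row inserts.
  FORALL i IN 1 .. l_ssns.COUNT
    INSERT INTO dtr_debtors1 (ssn) VALUES (l_ssns(i));

  COMMIT;
END;
/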

Upvotes: 0

Rahul Tripathi

Reputation: 172378

Try dropping all the indexes created on your table, then insert using a SELECT query. You can try this link, which will help you insert millions of rows into your database quickly.
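A rough sketch of that idea (the index name and the direct-path APPEND hint are assumptions for illustration, not part of the original answer):

-- Drop the index so it is not maintained row by row during the load.
DROP INDEX dtr_debtors1_ssn_idx;

-- Direct-path insert from a row-generating SELECT.
INSERT /*+ APPEND */ INTO dtr_debtors1 (ssn)
SELECT LEVEL FROM dual CONNECT BY LEVEL <= 1000000;
COMMIT;

-- Recreate the index once, after the data is in place.
CREATE INDEX dtr_debtors1_ssn_idx ON dtr_debtors1 (ssn);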

Upvotes: 0

Lokesh

Reputation: 7940

Your approach will lead to memory issues. The fastest way will be this [query edited after David's comment to take care of the null scenario]:

insert into dtr_debtors1(SSN)
select a.S1 + level
  from dual, (select nvl(max(ssn), 0) S1 from dtr_debtors1) a
connect by level <= 10000;

An insert-select is the fastest approach, as everything stays in RAM. This query can become slow if it spills into the global temp area, but then that calls for DB tuning. I don't think there can be anything faster than this.

A few more details on memory use by a query:

Each query will have its own PGA [Program Global Area], which is basically the RAM available to that query. If this area is not sufficient to return the query results, the SQL engine starts using the global temp tablespace, which is like hard disk, and the query starts becoming slow. If the data needed by the query is so huge that even the temp area is not sufficient, you will get a tablespace error.

So always design the query so that it stays in the PGA; otherwise it's a red flag.
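If you want to check whether a statement's work areas stayed in the PGA or spilled to temp, one way (a sketch, assuming you can query the V$ views and know the statement's sql_id) is:

SELECT operation_type,
       optimal_executions,      -- ran entirely in the PGA
       onepass_executions,      -- spilled to temp once
       multipasses_executions   -- spilled to temp repeatedly
  FROM v$sql_workarea
 WHERE sql_id = '&sql_id';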

Upvotes: 6

ntalbs

Reputation: 29438

Inserting one row at a time with a single INSERT statement inside a loop is slow. The fastest way is to use an insert-select like the following, which generates a million rows and inserts them in bulk.

insert into dtr_debtors1(SSN)
select level from dual connect by level <= 1000000;

Upvotes: 2
