Ianthe

Reputation: 5709

Oracle : Inserting large dataset into a table

I need to load data from a different remote database into our own database. I have written a single "complex" query using a WITH clause; it returns around 18 million rows.

What is most efficient way to do the insert?

  1. using a cursor and inserting rows one by one
  2. using a single INSERT INTO ... SELECT statement

or is there any other way?

Upvotes: 0

Views: 1453

Answers (4)

James Moorebank

Reputation: 41

You could use Data Synchronisation Studio and change the SELECT statement to take 1 million rows at a time (I think 18 million at once would probably overload your machine).

Upvotes: 0

Marcin Wroblewski

Reputation: 3571

As Justin wrote, the most efficient approach is to use a single SQL statement (INSERT INTO ... SELECT ...). Additionally, you can take advantage of direct-path insert.
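A minimal sketch of such a statement, assuming hypothetical table names (target_table, remote_table) and a database link named remote_db:

```sql
-- All object names here are placeholders, not from the question.
-- The APPEND hint requests a direct-path insert, which loads data
-- above the table's high-water mark and bypasses the buffer cache.
INSERT /*+ APPEND */ INTO target_table (id, name, created_at)
WITH src AS (
  SELECT id, name, created_at
  FROM   remote_table@remote_db
)
SELECT id, name, created_at
FROM   src;

COMMIT;  -- after a direct-path insert, the session must commit
         -- before it can read the table again
```

Note that direct-path inserts take an exclusive lock on the target table for the duration of the statement, so this suits batch loads rather than concurrent OLTP activity.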

Upvotes: 1

someuser2

Reputation: 1

18 million rows will require quite a bit of rollback (undo) for the single-INSERT-statement scenario. A cursor FOR loop would be much slower, but you'd be able to commit every x rows.

Personally, I'd go old school and dump the data out to a file, then load it via SQL*Loader (sqlldr) or Data Pump, especially as this is across databases.
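As a sketch of the SQL*Loader route, a control file for a dumped CSV might look like the following (the file, table, and column names are all hypothetical):

```sql
-- load.ctl: SQL*Loader control file (all names are placeholders)
OPTIONS (DIRECT=TRUE)   -- request a direct-path load
LOAD DATA
INFILE 'export.csv'
APPEND
INTO TABLE target_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(id,
 name,
 created_at DATE "YYYY-MM-DD")
```

It would then be invoked with something along the lines of `sqlldr userid=user/password control=load.ctl`, where the credentials and file name are placeholders for your environment.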

Upvotes: 0

Justin Cave

Reputation: 231661

The fastest way to do anything should be to use a single SQL statement. The next most efficient approach is to use a cursor doing BULK COLLECT operations to minimize context shifts between the SQL and PL/SQL engines. The least efficient approach is to use a cursor and process the data row-by-row.
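If the single-statement approach generates too much undo for one transaction, a BULK COLLECT loop with a LIMIT clause bounds memory use and allows periodic commits. A minimal sketch, with hypothetical table names and a hypothetical database link:

```sql
-- Sketch only: target_table, remote_table, and remote_db are
-- placeholders. LIMIT controls how many rows each fetch pulls
-- into PL/SQL memory; FORALL sends the whole batch back to the
-- SQL engine in one context switch.
DECLARE
  CURSOR c_src IS
    SELECT id, name FROM remote_table@remote_db;
  TYPE t_rows IS TABLE OF c_src%ROWTYPE;
  l_rows t_rows;
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_rows LIMIT 10000;
    EXIT WHEN l_rows.COUNT = 0;
    FORALL i IN 1 .. l_rows.COUNT
      INSERT INTO target_table VALUES l_rows(i);
    COMMIT;  -- commit per batch to keep undo usage bounded
  END LOOP;
  CLOSE c_src;
END;
/
```

The batch size of 10000 is an arbitrary illustration; tuning it trades PL/SQL memory against the number of round trips.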

Upvotes: 4
