Reputation: 101
I'm trying to update almost 800 million rows and I'm looking for the best approach. I already tried bulk collect and update, but it takes too long. I commit after every 5,000 records. After 130 minutes only 0.25% of the rows were done, so I am looking for another way. The structure of the table is
CAMP_TASK NUMBER
CODE_SYSTEM VARCHAR2(10 CHAR)
ID_SOURCE VARCHAR2(40 CHAR)
ID_SOURCE_PARENT VARCHAR2(40 CHAR)
DATE_EFFECTIVE DATE
PROC_INSERTED NUMBER
PROC_UPDATED NUMBER
CAMP_TASK_PARENT NUMBER
and I have to update the camp_task_parent
field with the camp_task value if id_source_parent is not null. In other words, if a row has an id_source_parent, I have to find the row whose id_source matches it, take that row's camp_task, and write it into camp_task_parent of the current row.
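Written as a single statement, the logic I need would be roughly the following sketch (my_table stands in for the real table name, and it assumes id_source is unique, otherwise the merge would fail on duplicate matches):
merge into my_table t
using (select id_source, camp_task
       from   my_table
       where  id_source is not null) p   -- candidate parent rows
on (t.id_source_parent = p.id_source)    -- rows with NULL id_source_parent never match
when matched then
  update set t.camp_task_parent = p.camp_task;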
Thanks!
Upvotes: 2
Views: 1708
Reputation: 3985
The fastest way would be to:
1) create new empty table with the same structure
create table new_table as select * from old_table where 1=2;
2) switch the new table to NOLOGGING so the direct-path insert generates minimal redo and you won't run out of recovery area space
alter table new_table nologging;
3) populate the new table with the already-updated rows using a direct-path insert, which bypasses the buffer cache (and the empty copy has no triggers or foreign keys to slow it down); see the fuller sketch after this list
insert /*+ APPEND */ into new_table (select <update column expressions> from old_table);
4) double-check data is migrated correctly
5) clean up
drop table old_table;
alter table new_table logging;
rename new_table to old_table;
6) re-create indexes
7) backup database
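Filling in the placeholder from step 3 for the table in the question, the load might look roughly like this. It is only a sketch: it assumes id_source is unique (otherwise the outer join would multiply rows) and that a row with a NULL id_source_parent simply keeps its current camp_task_parent.
-- steps 1-3: empty copy, nologging, direct-path load with the
-- parent's camp_task looked up via a self outer join
create table new_table as select * from old_table where 1 = 2;
alter table new_table nologging;

insert /*+ APPEND */ into new_table
  (camp_task, code_system, id_source, id_source_parent,
   date_effective, proc_inserted, proc_updated, camp_task_parent)
select o.camp_task, o.code_system, o.id_source, o.id_source_parent,
       o.date_effective, o.proc_inserted, o.proc_updated,
       nvl(p.camp_task, o.camp_task_parent)           -- parent found: use its camp_task
from   old_table o
       left join old_table p
              on p.id_source = o.id_source_parent;    -- NULL id_source_parent never matches

commit;   -- a direct-path insert must be committed before the table can be queried again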
Upvotes: 7