Reputation: 2321
What is the best way to delete a large number of records from a table? I have a requirement to delete some 30 million records from a table on a weekly basis. The problem is that there is an UNDO retention issue, so I am planning to break my delete statement into chunks.
Current Statement
DELETE FROM LARGE_TABLE WHERE LARGE_ID ='someValue';
Details of the table:
1. The table has around 70 million records.
2. The delete will affect at least 40-45% of the table.
3. The table is not partitioned.
I am planning to change the statement to:
LOOP
  DELETE FROM LARGE_TABLE
   WHERE LARGE_ID = 'someValue'
     AND ROWNUM <= some_chunk_size;  -- chunk size, e.g. a variable
  COMMIT;
  SELECT COUNT(1) INTO nCountValue FROM LARGE_TABLE WHERE LARGE_ID = 'someValue';
  EXIT WHEN nCountValue = 0;
END LOOP;
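For reference, a common variant of this loop (a sketch only; the chunk size of 10000 is an example, not from the question) uses the `SQL%ROWCOUNT` attribute to exit when the DELETE itself finds no more rows, which avoids running a separate COUNT over the table on every pass:

```sql
-- Sketch: chunked delete that stops when no rows remain.
-- SQL%ROWCOUNT holds the number of rows affected by the last DML statement.
BEGIN
  LOOP
    DELETE FROM LARGE_TABLE
     WHERE LARGE_ID = 'someValue'
       AND ROWNUM <= 10000;        -- example chunk size
    EXIT WHEN SQL%ROWCOUNT = 0;    -- nothing left to delete
    COMMIT;                        -- release UNDO for this chunk
  END LOOP;
  COMMIT;
END;
/
```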
The main idea is that this will reduce UNDO usage.
Question: is this the best practice? If not, any pointers? Please help.
Upvotes: 0
Views: 2794
Reputation: 1
To drop a table permanently, if your database is 10g or later, use PURGE:
drop table LARGE_TABLE_OLD purge;
From 10g onwards a dropped table goes to the recycle bin and can be restored, so to delete it permanently you need to use PURGE.
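For illustration, a minimal sketch of the recycle-bin behaviour this refers to (assuming the default `RECYCLEBIN=ON` setting in 10g+; table name as above):

```sql
-- A plain DROP only moves the table to the recycle bin:
DROP TABLE large_table_old;
-- ...from where it can still be restored:
FLASHBACK TABLE large_table_old TO BEFORE DROP;

-- DROP ... PURGE removes it permanently and frees the space:
DROP TABLE large_table_old PURGE;

-- To empty your own recycle bin entirely:
PURGE RECYCLEBIN;
```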
Upvotes: 0
Reputation: 17643
In this scenario, I would:
CREATE TABLE LARGE_TABLE_NEW AS
SELECT * FROM LARGE_TABLE
WHERE LARGE_ID <> 'someValue';
RENAME LARGE_TABLE TO LARGE_TABLE_OLD;
RENAME LARGE_TABLE_NEW TO LARGE_TABLE;
Verify that everything is OK, and then:
drop table LARGE_TABLE_OLD;
Recreate the indexes and constraints (better to save their DDL before the drop).
This will be much faster, because inserting the rows you keep is far cheaper than deleting the rows you don't.
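To save the index DDL before the drop, one option (a sketch; table and owner values are examples) is `DBMS_METADATA.GET_DDL`, which returns the CREATE statement for an existing object:

```sql
-- Sketch: capture the CREATE INDEX statements before dropping the old table.
SELECT DBMS_METADATA.GET_DDL('INDEX', index_name, owner) AS index_ddl
  FROM all_indexes
 WHERE table_name = 'LARGE_TABLE'
   AND owner = USER;
```

Spool or save the output, then re-run it against the renamed table after the swap.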
Upvotes: 2