mahan07

Reputation: 917

Delete data from table efficiently

We are using a tool that requires us to delete log tables. Right now, we keep 1 million rows and delete the rest, if any. This is time-consuming; in production it sometimes takes 12 hours to delete the data, which affects daily transactions. Is there any other way to delete from the log tables efficiently, without affecting daily transactions?

Let's suppose we want to keep the most recent 1 million rows.

Select query:

    select min(date) from (select date from table order by date desc) where rownum <= 1000000

Delete query:

    delete from table where date < (result of select query)
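The two queries above can be sketched end to end with SQLite as a stand-in (Oracle's `rownum <= N` becomes `ORDER BY date DESC LIMIT N`; the table and column names are placeholders for illustration):

```python
import sqlite3

# In-memory stand-in for the real log table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute('CREATE TABLE "table" (date TEXT)')
cur.executemany('INSERT INTO "table" (date) VALUES (?)',
                [(f"2024-01-{d:02d}",) for d in range(1, 31)])

KEEP = 10  # 1,000,000 in the real system

# Select query: the cutoff is the oldest date among the KEEP newest rows.
cutoff = cur.execute(
    'SELECT MIN(date) FROM '
    '(SELECT date FROM "table" ORDER BY date DESC LIMIT ?)',
    (KEEP,)).fetchone()[0]

# Delete query: remove everything older than the cutoff.
cur.execute('DELETE FROM "table" WHERE date < ?', (cutoff,))
conn.commit()

print(cur.execute('SELECT COUNT(*) FROM "table"').fetchone()[0])  # 10
```

Note that this still deletes all surplus rows in one statement, which is exactly what makes the operation slow on a large table; the answers below address that.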

Is there any way we can optimize these two queries?

Upvotes: 0

Views: 132

Answers (2)

Ross Bush

Reputation: 15175

Inserts into a smaller table are much faster than deletions from a larger table. In this case you can insert the records you want to keep into a staging table. If logging is not a concern and referential integrity will allow it, you could simply:

  1. Create a staging table that is an exact copy of your log table.
  2. Insert the records you intend to keep from your log table into your staging table.
  3. Drop your log table.
  4. Rename your staging table to log.
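The four steps above can be sketched with SQLite as a stand-in (the table names `log` and `log_staging` and the `KEEP` count are assumptions for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE log (id INTEGER PRIMARY KEY, date TEXT)")
cur.executemany("INSERT INTO log (date) VALUES (?)",
                [(f"2024-01-{d:02d}",) for d in range(1, 31)])

KEEP = 10  # stand-in for the 1,000,000 rows kept in production

# 1. Staging table with the same structure as the log table.
cur.execute("CREATE TABLE log_staging (id INTEGER PRIMARY KEY, date TEXT)")
# 2. Insert only the newest KEEP rows into staging.
cur.execute("INSERT INTO log_staging "
            "SELECT * FROM log ORDER BY date DESC LIMIT ?", (KEEP,))
# 3. Drop the old log table.
cur.execute("DROP TABLE log")
# 4. Rename the staging table to take its place.
cur.execute("ALTER TABLE log_staging RENAME TO log")
conn.commit()

print(cur.execute("SELECT COUNT(*) FROM log").fetchone()[0])  # 10
```

Copying the small kept set is cheap regardless of how large the log has grown, since DROP TABLE does not touch rows individually.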

Upvotes: 3

benjamin moskovits

Reputation: 5458

You want to delete ten thousand rows at a time.

delete top (10000) from tableA where condition1 <> xyz
while (@@rowcount > 0)
begin
     delete top (10000) from tableA where condition1 <> xyz
end

This way you won't have a large transaction log. You may want to experiment with the batch size (some people go with 1,000 rows), but it is very dependent on the amount of activity on your machine, the speed of your drives, and the placement and configuration of your log files. You said you want to keep certain rows, so I added condition1 <> xyz.
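The same batching idea can be sketched with SQLite as a stand-in (`DELETE TOP (n)` is SQL Server syntax; SQLite has no equivalent, so each batch here targets ids from a `LIMIT` subquery instead, and the table/column names are placeholders):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE tableA (id INTEGER PRIMARY KEY, condition1 TEXT)")
# 20 rows to keep ('xyz') interleaved with 80 rows to delete ('old').
cur.executemany("INSERT INTO tableA (condition1) VALUES (?)",
                [("xyz" if i % 5 == 0 else "old",) for i in range(100)])

BATCH = 10  # experiment with this, as suggested above

while True:
    # Delete one small batch of rows that fail the keep condition.
    cur.execute("DELETE FROM tableA WHERE id IN "
                "(SELECT id FROM tableA WHERE condition1 <> 'xyz' LIMIT ?)",
                (BATCH,))
    conn.commit()  # each batch commits, keeping the open transaction small
    if cur.rowcount == 0:
        break

print(cur.execute("SELECT COUNT(*) FROM tableA").fetchone()[0])  # 20
```

Committing after every batch is what keeps the transaction log small; a single large DELETE must hold all of its changes in one transaction.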

Upvotes: 1
