Reputation: 35953
We want to remove tons of obsolete data from a table, but doing it in one statement will lock the table for a long while. Is it possible to use a cursor to delete, say, one hundred records per transaction in a while loop?
And where can I find an example of this?
Upvotes: 5
Views: 14221
Reputation: 15849
Something like this:
DECLARE @stillgoing bit;
SET @stillgoing = 1;
WHILE @stillgoing = 1
BEGIN
    DELETE TOP (100) FROM YourTableName
    WHERE IsObsolete = 1;
    IF @@ROWCOUNT = 0
        SET @stillgoing = 0;
    CHECKPOINT; /* Will encourage the log to clear if it's in Simple recovery model */
END
Edit: This will only work on SQL Server 2005 and later. As we've just learned it's SQL Server 2000, use this code instead:
DECLARE @stillgoing bit
SET @stillgoing = 1
SET ROWCOUNT 100 /* Limits each DELETE to 100 rows on SQL 2000 */
WHILE @stillgoing = 1
BEGIN
    DELETE YourTableName
    WHERE IsObsolete = 1
    IF @@ROWCOUNT = 0
        SET @stillgoing = 0
    CHECKPOINT /* Will encourage the log to clear if it's in Simple recovery model */
END
SET ROWCOUNT 0 /* Reset so later statements in the session aren't limited to 100 rows */
And... Simple Recovery Model means that the log will truncate on checkpoints, rather than only when the log is backed up.
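If you need to check or change the recovery model first, something like this should work (YourDatabase is a placeholder; sys.databases is SQL Server 2005 and later, though the ALTER DATABASE syntax also works on SQL 2000):
-- Check the current recovery model (SQL Server 2005+)
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'YourDatabase';
-- Switch to Simple recovery
ALTER DATABASE YourDatabase SET RECOVERY SIMPLE;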
Upvotes: 3
Reputation: 86892
DECLARE @MyPrimaryKey [SomeType]
DECLARE @MyCursor CURSOR
SET @MyCursor = CURSOR FOR
    SELECT MyPrimaryKey FROM MyTable -- add a WHERE clause here to target only the obsolete rows
OPEN @MyCursor
FETCH NEXT FROM @MyCursor
INTO @MyPrimaryKey
WHILE @@FETCH_STATUS = 0
BEGIN
    WAITFOR DELAY '00:00:05' -- brief pause so other sessions can take locks between deletes
    BEGIN TRANSACTION
    DELETE FROM MyTable WHERE MyPrimaryKey = @MyPrimaryKey
    COMMIT TRANSACTION
    FETCH NEXT FROM @MyCursor
    INTO @MyPrimaryKey
END
CLOSE @MyCursor
DEALLOCATE @MyCursor
Upvotes: 1
Reputation: 630599
You can commit every 100 (or 1000, or whatever) records in the loop, releasing the lock and letting any pending operations get their work in. Otherwise you generate a huge transaction log, and managing it grows worse than O(n^2) as it gets large, so the log handling ends up taking a large portion of the time instead of the actual deleting. If you batch and commit, you're safer, the rollback/log files won't grow like crazy, and the locks stay manageable.
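A minimal sketch of that batch-and-commit pattern, assuming SQL Server 2005+ and reusing the hypothetical YourTableName/IsObsolete names from the answer above:
WHILE 1 = 1
BEGIN
    BEGIN TRANSACTION
    DELETE TOP (1000) FROM YourTableName
    WHERE IsObsolete = 1
    IF @@ROWCOUNT = 0
    BEGIN
        COMMIT TRANSACTION -- nothing left to delete
        BREAK
    END
    COMMIT TRANSACTION -- each batch commits, releasing locks so pending work can get in
END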
If, however, you need that huge rollback as an option, there are two choices:
Upvotes: 1