Reputation: 45
I have a database with a very large table (150 GB). Most of the table's size comes from a PDF file stored in a varbinary(max) column.
We have finally moved most of the PDF files out of the database, so my task now is to update the table and set the PDF column to NULL in 95% of the 1 million rows.
I use the following approach to update the table:
Set the recovery model to simple.
Update the table in chunks of 5000 rows at a time. The RAISERROR is just there so I can see that something is happening, because PRINT output is buffered and only shown when the whole batch completes.
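For reference, switching to the simple recovery model is a single statement (the database name here is a placeholder):
-- Assumed database name; replace with the actual one.
ALTER DATABASE [MyDatabase] SET RECOVERY SIMPLE;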
My question is as follows:
Is it okay for me to shrink the database afterwards?
I know about the index fragmentation this will cause, so if I run an index optimizer script after the shrink, is the shrink fine to do in this case? ;)
DECLARE @msg AS nvarchar(200)
DECLARE @Stopper AS int = 1
DECLARE @IdRange AS int = 0
DECLARE @MaxId AS int = (SELECT MAX(Id) FROM dbo.EmailLog)
DECLARE @COUNT AS int = 5000

SET @IdRange = @COUNT

WHILE (@Stopper = 1)
BEGIN
    -- Clear the PDF column for the current window of Ids.
    -- >= and < keep consecutive windows from overlapping on the boundary row.
    UPDATE dbo.EmailLog
    SET PdfFile = NULL
    WHERE InvoiceNo > 0
      AND Id >= @IdRange - @COUNT
      AND Id < @IdRange

    -- Stop once the window has passed the highest Id; otherwise move to the next chunk.
    IF (@IdRange > @MaxId)
    BEGIN
        SET @Stopper = 0;
    END
    ELSE
    BEGIN
        SET @IdRange += @COUNT
        SET @msg = 'Max @IdRange is now: ' + CAST(@IdRange AS nvarchar(20))
        RAISERROR(@msg, 0, 1) WITH NOWAIT
    END
END
Upvotes: 2
Views: 117
Reputation: 5694
Yes, if you removed a lot of data and you do not expect the database to grow back to that size for a long time, it is fine to shrink the database, as long as you reindex (or defragment the indexes) after shrinking.
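Roughly, that sequence looks like the sketch below (database and table names are placeholders, and whether you REBUILD or REORGANIZE depends on how fragmented each index ends up):
-- Shrink after the varbinary data has been cleared
-- (database name is a placeholder; DBCC SHRINKFILE can target a single file instead).
DBCC SHRINKDATABASE (N'MyDatabase');

-- Then rebuild (or reorganize) the now-fragmented indexes, for example:
ALTER INDEX ALL ON dbo.EmailLog REBUILD;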
Upvotes: 1