Leptonator

Reputation: 3519

COMMIT Transaction based on SQL SELECT every N rows

We could possibly do this with a CURSOR. However, we are writing out a rollback script file for a go-live in a couple of weeks.

ANSWER/UPDATE: Wound up using a CURSOR. Probably not the best way to do this, but it works.

-- Generates the rollback script by PRINTing one UPDATE per row, with a
-- COMMIT/BEGIN pair emitted every 1000 rows.
DECLARE @updateCount BIGINT
DECLARE @AnId        BIGINT
DECLARE @aField      NVARCHAR(1000)
DECLARE @NxtID       BIGINT

-- Initialize the update count
SET @updateCount = 0

-- Cursor over the rows to be reverted
DECLARE updatecursor CURSOR FOR
    SELECT id,
           field,
           ORG_ID
    FROM some_table
    WHERE Another_ID IN (SELECT Another_ID FROM Another_Table WHERE NAME = 'Business_Group')

OPEN updatecursor
FETCH NEXT FROM updatecursor INTO @AnId, @aField, @NxtID

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Begin a transaction for the first record
    IF @updateCount = 0
    BEGIN
        PRINT 'BEGIN TRANSACTION'
        BEGIN TRANSACTION
    END

    -- Emit one UPDATE statement for the current row
    -- (note: a value containing a single quote would need doubling before embedding)
    PRINT 'UPDATE some_table SET Field=' + CHAR(39) + @aField + CHAR(39)
        + ' WHERE ID=' + CONVERT(VARCHAR, @AnId)
        + ' and Another_ID=' + CONVERT(NVARCHAR, @NxtID)

    SET @updateCount = @updateCount + 1

    -- Commit every 1000 records and start a new transaction
    IF @updateCount % 1000 = 0
    BEGIN
        PRINT 'COMMIT TRANSACTION'
        PRINT 'WAITFOR DELAY ''00:00:01'''
        -- Done with the previous batch; start the next one
        PRINT 'BEGIN TRANSACTION'
        COMMIT TRANSACTION
        BEGIN TRANSACTION
    END

    FETCH NEXT FROM updatecursor INTO @AnId, @aField, @NxtID
END

-- Commit the final, partial batch
IF @updateCount % 1000 <> 0
BEGIN
    PRINT 'COMMIT TRANSACTION'
    PRINT 'WAITFOR DELAY ''00:00:01'''
    COMMIT TRANSACTION
END

CLOSE updatecursor
DEALLOCATE updatecursor
GO
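
For reference, the rollback script that the PRINT statements above emit has this shape (the IDs and values shown here are placeholders):

BEGIN TRANSACTION
UPDATE some_table SET Field='old value 1' WHERE ID=1 and Another_ID=100
UPDATE some_table SET Field='old value 2' WHERE ID=2 and Another_ID=100
... 998 more UPDATEs ...
COMMIT TRANSACTION
WAITFOR DELAY '00:00:01'
BEGIN TRANSACTION
... next batch of 1000 UPDATEs ...
COMMIT TRANSACTION
WAITFOR DELAY '00:00:01'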

What we would like to do is write a COMMIT into the rollback SQL script every 1,000 or 10,000 rows. The SELECT works fine and generates the UPDATE statements correctly.

However, we have about 200k records to UPDATE. When we tested the rollback this morning, SQL Server did not handle that many records in a single transaction well.

We have seen this: http://www.sqlusa.com/bestpractices2005/hugeupdate/. It would be great if all of the records were the same, but they are not.

Because of our limited permission set, Corporate does not allow the use of OPENROWSET or related functions.

Here is what we have so far:

-- 202,739 rows to cover
DECLARE @UPDATECOUNT INT
SET @UPDATECOUNT = 0

SELECT
    'UPDATE some_table SET field=' + CHAR(39) + original_field + CHAR(39)
    + ' WHERE ID=' + CONVERT(VARCHAR, ID) AS 'BackupData'
FROM some_table (NOLOCK)
WHERE Another_ID IN (SELECT Another_ID FROM Another_Table WHERE NAME = 'Business-group')

-- This block runs only once, after the SELECT completes; it does not
-- execute once per returned row, so the counter only ever reaches 1.
SET @UPDATECOUNT = @UPDATECOUNT + 1
-- Commit every 10000 records and start a new transaction
IF @UPDATECOUNT % 10000 = 0
BEGIN
    COMMIT TRANSACTION
    BEGIN TRANSACTION
END
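
For comparison, the batch boundaries can also be folded into the generating SELECT itself, without a cursor. The sketch below assumes SQL Server 2005 or later (it needs ROW_NUMBER and COUNT(*) OVER) and reuses the anonymized table and column names from the cursor version above; the batch size of 1000 and the ORDER BY id are arbitrary choices.

;WITH numbered AS (
    SELECT  id,
            field,
            ORG_ID,
            ROW_NUMBER() OVER (ORDER BY id) AS rn,
            COUNT(*)     OVER ()            AS total_rows
    FROM    some_table
    WHERE   Another_ID IN (SELECT Another_ID FROM Another_Table WHERE NAME = 'Business_Group')
)
SELECT
    -- open a transaction before the first statement of each batch of 1000
    CASE WHEN rn % 1000 = 1 THEN 'BEGIN TRANSACTION' + CHAR(13) + CHAR(10) ELSE '' END
    + 'UPDATE some_table SET Field=' + CHAR(39)
    + REPLACE(field, CHAR(39), CHAR(39) + CHAR(39)) + CHAR(39)
    + ' WHERE ID=' + CONVERT(VARCHAR, id)
    + ' AND Another_ID=' + CONVERT(VARCHAR, ORG_ID)
    -- commit after the last statement of each batch and after the final row
    + CASE WHEN rn % 1000 = 0 OR rn = total_rows
           THEN CHAR(13) + CHAR(10) + 'COMMIT TRANSACTION'
                + CHAR(13) + CHAR(10) + 'WAITFOR DELAY ' + CHAR(39) + '00:00:01' + CHAR(39)
           ELSE '' END
      AS RollbackScript
FROM numbered
ORDER BY rn

The CHAR(13) + CHAR(10) pieces embed line breaks so that each batch boundary lands on its own line when the result is saved as text.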

Thanks!

Upvotes: 0

Views: 7191

Answers (1)

StackOverflowGuest

Reputation: 41

Create a temp table to store the IDs you've updated. Update the TOP X records until the condition is no longer met (excluding the IDs that you've already updated).

Something like this:

CREATE TABLE #updated ( ID INT )

WHILE EXISTS ( SELECT 1 FROM some_table
               WHERE originalwhereclause = 1
                 AND NOT EXISTS ( SELECT 1 FROM #updated WHERE ID = some_table.ID ) )
BEGIN
    BEGIN TRANSACTION
    UPDATE TOP (10000) some_table
        SET field = value
    OUTPUT Inserted.ID INTO #updated
    FROM some_table
    WHERE originalwhereclause = 1
        AND NOT EXISTS ( SELECT 1 FROM #updated WHERE ID = some_table.ID )
    COMMIT TRANSACTION
END
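
The #updated table is only needed when the UPDATE does not change whether a row still matches the WHERE clause. If the update does change the predicate (for example, it sets the very column the WHERE clause filters on), a simpler loop that stops when no rows are affected works too. A sketch using the same placeholder names as above:

DECLARE @rows INT
SET @rows = 1

WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION

    -- update the next batch of rows that still match the predicate
    UPDATE TOP (10000) some_table
        SET field = value
    WHERE originalwhereclause = 1

    -- capture the affected row count before any other statement resets it
    SET @rows = @@ROWCOUNT

    COMMIT TRANSACTION
END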

Upvotes: 1
