jeni

Reputation: 983

How to copy a large amount of data from one table to another in SQL Server

I want to copy a large amount of data from one table to another table. I used cursors in a stored procedure to do this, but it works only for tables with few records. If the tables contain more records, it executes for a long time and then hangs. Please give some suggestions on how I can copy the data in a faster way. My SP is as below:

--exec uds_shop

--select * from CMA_UDS.dbo.Dim_Shop
--select * from UDS.dbo.Dim_Shop
--delete from CMA_UDS.dbo.Dim_Shop

alter procedure uds_shop
as
begin
declare @dwkeyshop int
declare @shopdb int
declare @shopid int
declare @shopname nvarchar(60)
declare @shoptrade int
declare @dwkeytradecat int
declare @recordowner nvarchar(20)
declare @LogMessage varchar(600)

Exec CreateLog 'Starting Process', 1 

DECLARE cur_shop CURSOR FOR
    select 
    DW_Key_Shop,Shop_ID,Shop_Name,Trade_Sub_Category_Code,DW_Key_Source_DB,DW_Key_Trade_Category,Record_Owner
    from 
    UDS.dbo.Dim_Shop 

    OPEN  cur_shop
    FETCH NEXT FROM cur_shop INTO @dwkeyshop,@shopid,@shopname,@shoptrade, @shopdb ,@dwkeytradecat,@recordowner
    WHILE @@FETCH_STATUS = 0 
    BEGIN 
    Set @LogMessage = ''
    Set @LogMessage = 'Records insertion/updation start date and time : ''' + Convert(varchar(19), GetDate()) + ''''            

    if (isnull(@dwkeyshop, '') <> '')
        begin 
        if not exists (select 1 from CMA_UDS.dbo.Dim_Shop as crmshop where crmshop.DW_Key_Shop = @dwkeyshop and crmshop.DW_Key_Source_DB = @shopdb)
        begin
        Set @LogMessage = Ltrim(Rtrim(@LogMessage)) + ' ' + 'Record for shop table is inserting...'         
                insert into 
                 CMA_UDS.dbo.Dim_Shop
                (DW_Key_Shop,DW_Key_Source_DB,DW_Key_Trade_Category,Record_Owner,Shop_ID,Shop_Name,Trade_Sub_Category_Code)
                values
                (@dwkeyshop,@shopdb,@dwkeytradecat,@recordowner,@shopid,@shopname,@shoptrade)
        Set @LogMessage = Ltrim(Rtrim(@LogMessage)) + ' ' + 'Record successfully inserted in shop table for shop Id : ' + Convert(varchar, @shopid) 

            end
            else

            begin 
             Set @LogMessage = Ltrim(Rtrim(@LogMessage)) + ' ' + 'Record for Shop table is updating...' 
             update 
             CMA_UDS.dbo.Dim_Shop
             set DW_Key_Trade_Category=@dwkeytradecat,
             Record_Owner=@recordowner,
             Shop_ID=@shopid,Shop_Name=@shopname,Trade_Sub_Category_Code=@shoptrade

             where       
                DW_Key_Shop=@dwkeyshop and DW_Key_Source_DB=@shopdb     
             Set @LogMessage = Ltrim(Rtrim(@LogMessage)) + ' ' + 'Record successfully updated for shop Id : ' + Convert(varchar, @shopid)
            end
        end
        Exec CreateLog @LogMessage, 0
        FETCH NEXT FROM cur_shop INTO @dwkeyshop,@shopid,@shopname,@shoptrade, @shopdb ,@dwkeytradecat,@recordowner
    end
    CLOSE cur_shop 
    DEALLOCATE cur_shop
End

Upvotes: 2

Views: 5109

Answers (4)

Frank Thomas

Reputation: 2514

Cursors are notoriously slow, and RAM can become a problem for very large datasets.

It does look like you are doing a good bit of logging in each iteration, so you may be stuck with the cursor, but I would instead look for a way to break the job up into multiple invocations so that you can keep your footprint small.

If you have an autonumber column, I would add a '@startIdx bigint' parameter to the procedure, and redefine your cursor statement to take the 'TOP 1000' rows 'WHERE [autonumberField] > @startIdx ORDER BY [autonumberField]'. Then create a new stored procedure with something like:

DECLARE @startIdx bigint = 0
WHILE (SELECT COUNT(*) FROM <sourceTable>) > @startIdx
BEGIN
    EXEC <your stored procedure> @startIdx
    SET @startIdx = @startIdx + 1000
END

Also, make sure your database files are set to auto-grow, and that they do so in large increments, so you are not spending all your time growing your data files.
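Applied to the procedure in the question, the batched cursor declaration might look like the following. This is a sketch that assumes DW_Key_Shop is the autonumber (identity) column and that @startIdx is the new procedure parameter:

```sql
-- Hypothetical batched version: fetch at most 1000 rows per invocation,
-- resuming after the highest key processed so far (@startIdx).
DECLARE cur_shop CURSOR FOR
    SELECT TOP 1000
        DW_Key_Shop, Shop_ID, Shop_Name, Trade_Sub_Category_Code,
        DW_Key_Source_DB, DW_Key_Trade_Category, Record_Owner
    FROM UDS.dbo.Dim_Shop
    WHERE DW_Key_Shop > @startIdx      -- skip rows already processed
    ORDER BY DW_Key_Shop
```

Note that stepping @startIdx by a fixed 1000 assumes the key values are contiguous; if they have gaps, the caller would instead track the last key value actually processed.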

Upvotes: 0

priya

Reputation: 9

INSERT INTO targetTable 
SELECT * FROM sourceTable 
WHERE someCriteria -- based on the criteria you can copy/move the records

Upvotes: 0

Mitch Wheat

Reputation: 300549

Assuming targetTable and sourceTable have the same schema...

INSERT INTO targetTable
SELECT * FROM sourceTable
WHERE someCriteria

Avoid the use of cursors unless there is no other way (rare).

You can use the WHERE clause to filter out any duplicate records.
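For example, applied to the tables in the question, the cursor's duplicate check could be folded into the set-based insert. A sketch, assuming (DW_Key_Shop, DW_Key_Source_DB) is the logical key, as the cursor logic implies:

```sql
-- Insert only the rows whose (DW_Key_Shop, DW_Key_Source_DB) key
-- is not already present in the target table.
INSERT INTO CMA_UDS.dbo.Dim_Shop
    (DW_Key_Shop, DW_Key_Source_DB, DW_Key_Trade_Category,
     Record_Owner, Shop_ID, Shop_Name, Trade_Sub_Category_Code)
SELECT s.DW_Key_Shop, s.DW_Key_Source_DB, s.DW_Key_Trade_Category,
       s.Record_Owner, s.Shop_ID, s.Shop_Name, s.Trade_Sub_Category_Code
FROM UDS.dbo.Dim_Shop AS s
WHERE NOT EXISTS (SELECT 1
                  FROM CMA_UDS.dbo.Dim_Shop AS t
                  WHERE t.DW_Key_Shop = s.DW_Key_Shop
                    AND t.DW_Key_Source_DB = s.DW_Key_Source_DB)
```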

If you have an identity column, use an explicit column list that doesn't contain the identity column.

You can also try disabling constraints and removing indexes provided you replace them (and make sure the constraints are checked) afterwards.

If you are on SQL Server 2008 (onwards) you can use the MERGE statement.
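A sketch of what that MERGE might look like for the question's tables, replacing the whole insert-or-update cursor with one set-based statement (again assuming (DW_Key_Shop, DW_Key_Source_DB) is the key):

```sql
-- Matched rows are updated, unmatched rows are inserted.
-- MERGE statements must be terminated with a semicolon.
MERGE CMA_UDS.dbo.Dim_Shop AS target
USING UDS.dbo.Dim_Shop AS source
   ON target.DW_Key_Shop = source.DW_Key_Shop
  AND target.DW_Key_Source_DB = source.DW_Key_Source_DB
WHEN MATCHED THEN
    UPDATE SET target.DW_Key_Trade_Category   = source.DW_Key_Trade_Category,
               target.Record_Owner            = source.Record_Owner,
               target.Shop_ID                 = source.Shop_ID,
               target.Shop_Name               = source.Shop_Name,
               target.Trade_Sub_Category_Code = source.Trade_Sub_Category_Code
WHEN NOT MATCHED THEN
    INSERT (DW_Key_Shop, DW_Key_Source_DB, DW_Key_Trade_Category,
            Record_Owner, Shop_ID, Shop_Name, Trade_Sub_Category_Code)
    VALUES (source.DW_Key_Shop, source.DW_Key_Source_DB, source.DW_Key_Trade_Category,
            source.Record_Owner, source.Shop_ID, source.Shop_Name,
            source.Trade_Sub_Category_Code);
```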

Upvotes: 5

Chris

Reputation: 5654

From my personal experience, when you copy a huge amount of data from one table to another (with similar constraints), drop the constraints on the table the data is being copied into. Once the copy is done, reinstate all the constraints.

I could reduce the copy time from 7 hours to 30 minutes in my case (100 million records with 6 constraints).
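A sketch of the disable/reinstate pattern in T-SQL, using the question's target table as a stand-in. WITH CHECK on re-enable makes SQL Server re-validate the existing rows, so the constraints remain trusted afterwards:

```sql
-- Disable all foreign-key and check constraints on the target table.
ALTER TABLE CMA_UDS.dbo.Dim_Shop NOCHECK CONSTRAINT ALL;

-- ... perform the bulk copy here ...

-- Re-enable and re-validate all constraints once the copy is done.
ALTER TABLE CMA_UDS.dbo.Dim_Shop WITH CHECK CHECK CONSTRAINT ALL;
```

Indexes would be dropped and re-created separately; NOCHECK only covers constraints.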

Upvotes: 1
