George2

Reputation: 45761

SQL Server select and insert issue

I have two databases, DB1 and DB2, on the same database server. Table1 is in DB1 and Table2 is in DB2. Currently I am using INSERT INTO ... SELECT * to transfer all data from Table2 into Table1 (Table1 is empty; my purpose is to make a copy of the data in Table2). The table structure is a clustered ID column (of type GUID) and an XML binary (varbinary) data column.

My current issue is that memory consumption is very high. Are there any good ideas to reduce it? My rough idea is to start several small transactions and select/insert a portion of the data in each one.

I am using VSTS 2008 + C# + ADO.NET + SQL Server 2008 Enterprise. Any good solutions or reference samples?
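A minimal T-SQL sketch of the chunking idea I have in mind (the column names ID and XmlData and the batch size of 10,000 are placeholders, not my actual schema; the NOT EXISTS check relies on the clustered GUID key to skip rows already copied):

```sql
-- Copy in batches, one transaction per batch, looping until no rows remain.
DECLARE @rows int = 1;

WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION;

    INSERT INTO DB1.dbo.Table1 (ID, XmlData)
    SELECT TOP (10000) s.ID, s.XmlData
    FROM DB2.dbo.Table2 AS s
    WHERE NOT EXISTS (SELECT 1 FROM DB1.dbo.Table1 AS d WHERE d.ID = s.ID);

    SET @rows = @@ROWCOUNT;

    COMMIT TRANSACTION;
END
```

Each batch commits independently, so the transaction log and memory footprint stay bounded; but I am not sure this is the right approach.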

Here is my current code, which throws an OutOfMemoryException. I am using the ADO.NET SqlBulkCopy feature.

using System;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;

namespace BulkCopyTable
{
    public class CopyData
    {
        readonly string _sourceConnectionString;
        readonly string _destinationConnectionString;

        public CopyData(string sourceConnectionString,
                        string destinationConnectionString)
        {
            _sourceConnectionString = sourceConnectionString;
            _destinationConnectionString = destinationConnectionString;
        }

        public void CopyTable(string table)
        {
            using (SqlConnection source =
                    new SqlConnection(_sourceConnectionString))
            using (SqlCommand command = new SqlCommand(
                    string.Format("SELECT * FROM [{0}]", table), source))
            {
                source.Open();

                // Stream rows from the source straight into the bulk copy.
                using (IDataReader dr = command.ExecuteReader())
                using (SqlBulkCopy copy =
                        new SqlBulkCopy(_destinationConnectionString))
                {
                    copy.DestinationTableName = table;
                    copy.WriteToServer(dr);
                }
            }
        }
    }

    class Program
    {
        static void Main(string[] args)
        {
            CopyData copier = new CopyData(
                ConfigurationManager.AppSettings["source"],
                ConfigurationManager.AppSettings["destination"]);
            Console.WriteLine("Begin Copy");
            copier.CopyTable(ConfigurationManager.AppSettings["Table"]);
            Console.WriteLine("End Copy");
        }
    }
}

Upvotes: 1

Views: 931

Answers (4)

Funka

Reputation: 4278

Would setting up a cursor, stepping through each row in your source table, and inserting each fetched row use less memory? Books Online (BOL) has plenty of examples of stepping through cursors.


Update: here's an example I copied and modified from the BOL T-SQL reference on FETCH. (The comments are from the BOL article; I just changed a few names.)

-- // Declare the variables to store the values returned by FETCH.
DECLARE @id uniqueidentifier, @xml varbinary(max)

DECLARE myCursor CURSOR FOR
SELECT id, xml FROM DB2..Table2

OPEN myCursor

-- // Perform the first fetch and store the values in variables.
-- // Note: the variables are in the same order as the columns
-- // in the SELECT statement.

FETCH NEXT FROM myCursor
INTO @id, @xml

-- // Check @@FETCH_STATUS to see if there are any more rows to fetch.
WHILE @@FETCH_STATUS = 0
BEGIN

    -- // Do something with your values!
    INSERT INTO DB1..Table1 (id, xml) VALUES (@id, @xml)

    -- // This is executed as long as the previous fetch succeeds.
    FETCH NEXT FROM myCursor
    INTO @id, @xml

END

CLOSE myCursor
DEALLOCATE myCursor

Upvotes: 1

shahkalpesh

Reputation: 33476

Is this a one-time job? If so, you could use DTS or SSIS.
If not, see if you can use SqlBulkCopy and related classes in the framework.

EDIT: I saw your code, and I suggest setting the BatchSize property before the call to WriteToServer.
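A minimal sketch of that change inside your CopyTable method (the batch size of 1,000 is an arbitrary starting point to tune, not a recommendation):

```csharp
using (SqlBulkCopy copy =
        new SqlBulkCopy(_destinationConnectionString))
{
    copy.DestinationTableName = table;
    copy.BatchSize = 1000;      // commit every 1,000 rows instead of one huge batch
    copy.BulkCopyTimeout = 0;   // 0 = no timeout; large copies can exceed the 30 s default
    copy.WriteToServer(dr);
}
```

With a nonzero BatchSize, each batch is committed as a separate transaction, so the server does not have to hold the whole copy in a single transaction.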

Upvotes: 2

Jason Horner

Reputation: 3690

I think you want to set the batch size so that the copy is chunked into manageable pieces.

Bulk copy batch size

Check this Bulk loading data whitepaper for other techniques: Bulk load

Upvotes: 1

skalb

Reputation: 5567

You could try using the BCP utility.

It can be run from C# using the Process class, if needed.
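A hypothetical pair of commands, assuming placeholder server and table names (-T uses a trusted connection, -n keeps the data in SQL Server native format, and -b commits the import in batches of 1,000 rows):

```shell
# Export the source table to a native-format data file
bcp DB2.dbo.Table2 out table2.dat -S myserver -T -n

# Import the file into the destination table, committing every 1,000 rows
bcp DB1.dbo.Table1 in table2.dat -S myserver -T -n -b 1000
```

Because bcp streams to and from a file and batches its commits, it tends to have a much smaller memory footprint than an in-process copy.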

Upvotes: 2
