vikram

Reputation: 1

SQL Transaction with ADO.Net

I am new to database interaction with C#. I am trying to write 10000 records into the database in a loop, using SqlCommand and SqlConnection objects inside a SqlTransaction, committing after every 5000 rows. It takes 10 seconds to complete.

SqlConnection myConnection = new SqlConnection("..Connection String..");
myConnection.Open();
SqlCommand myCommand = new SqlCommand();
myCommand.CommandText = "exec StoredProcedureInsertOneRowInTable Param1, Param2........";
myCommand.Connection = myConnection;
SqlTransaction myTrans = myConnection.BeginTransaction();
myCommand.Transaction = myTrans;
for (int i = 0; i < 10000; i++)
{
    myCommand.ExecuteNonQuery();
    if (i % 5000 == 0)
    {
        myTrans.Commit();
        myTrans = myConnection.BeginTransaction();
        myCommand.Transaction = myTrans;
    }
}

The above code gives me only 1000 row writes/sec in the database.

But when I implemented the same logic in SQL and executed it against the database from SQL Server Management Studio, it gave me 10000 writes/sec. When I compared the behaviour of the two approaches, I saw that the ADO.Net execution performs a large number of logical reads.

My questions are:

1. Why are there logical reads during the ADO.Net execution?
2. Does a transaction involve some handshaking?
3. Why are the logical reads not present when executing from Management Studio?
4. If I want very fast insert transactions on the database, what approach should I take?

Updated Information about Database objects

Table: tbl_FastInsertTest. No primary key, only 5 fields: the first three (F1, F2, F3) are of type int, and the last two (F4, F5) are varchar(30).

Stored procedure:

create proc stp_FastInsertTest
    @nF1 int,
    @nF2 int,
    @nF3 int,
    @sF4 varchar(30),
    @sF5 varchar(30)
as
begin
    set nocount on
    insert into tbl_FastInsertTest
    (
        [F1],
        [F2],
        [F3],
        [F4],
        [F5]
    )
    values
    (
        @nF1,
        @nF2,
        @nF3,
        @sF4,
        @sF5
    )
end

SQL Block Executing on SSMS

--When I execute the following code in SSMS it gives me more than 10000 writes per second, but when I execute the same stored procedure through ADO.Net it gives 1000 to 1200 writes per second

--while reading no locks

begin tran
declare @i int
set @i = 0

while (1 <> 0)
begin
    exec stp_FastInsertTest 1, 2, 3, 'vikram', 'varma'
    set @i = @i + 1

    if (@i = 5000)
    begin
        commit tran
        set @i = 0
        begin tran
    end
end

Upvotes: 0

Views: 3466

Answers (3)

Emre Aydinceren

Reputation: 96

  • For performant inserts, take a look at the SqlBulkCopy class; if it works for you, it should be fast.
  • As Sean said, using parameterized queries is always a good idea.
  • Using a StringBuilder to batch a thousand INSERT statements into a single command and committing the transaction once is a proven way of inserting data:

    var sb = new StringBuilder();

    for (int i = 0; i < 1000; i++)
    {
        sb.AppendFormat("INSERT INTO Table(col1,col2) VALUES({0},{1});", values1[i], values2[i]);
    }

    sqlCommand.CommandText = sb.ToString();
    
  • Your code doesn't look right to me: the `i % 5000 == 0` check fires on the very first iteration (i = 0), committing an empty transaction, and the rows inserted after the last commit are never committed.

  • It is always good practice to drop indexes while inserting a lot of data and add them back later. Indexes will slow down your writes.
  • SQL Server Management Studio does not add transactions itself, but SQL does; try this:

    BEGIN TRANSACTION MyTransaction
    INSERT INTO Table(Col1,Col2) VALUES(Val10,Val20);
    INSERT INTO Table(Col1,Col2) VALUES(Val11,Val21);
    INSERT INTO Table(Col1,Col2) VALUES(Val12,Val23);
    COMMIT TRANSACTION
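The SqlBulkCopy suggestion above can be sketched as follows. This is a hedged example, not the asker's code: it assumes the tbl_FastInsertTest table from the question, the System.Data.SqlClient provider, and placeholder connection string and values.

```csharp
// Sketch only: stage rows in a DataTable, then push them to the server
// in batches with SqlBulkCopy instead of one INSERT per round-trip.
using System.Data;
using System.Data.SqlClient;

class BulkInsertSketch
{
    static void Main()
    {
        // Columns mirror tbl_FastInsertTest from the question.
        var table = new DataTable();
        table.Columns.Add("F1", typeof(int));
        table.Columns.Add("F2", typeof(int));
        table.Columns.Add("F3", typeof(int));
        table.Columns.Add("F4", typeof(string));
        table.Columns.Add("F5", typeof(string));

        for (int i = 0; i < 10000; i++)
            table.Rows.Add(i, i + 1, i + 2, "vikram", "varma");

        using (var connection = new SqlConnection("..Connection String.."))
        {
            connection.Open();
            using (var bulk = new SqlBulkCopy(connection))
            {
                bulk.DestinationTableName = "tbl_FastInsertTest";
                bulk.BatchSize = 5000; // rows sent per batch to the server
                bulk.WriteToServer(table);
            }
        }
    }
}
```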

Upvotes: 2

Sean

Reputation: 786

You need to use a parameterized query so that the execution plan can be processed and cached. Since you're using string concatenation (shudder, this is bad; google SQL injection) to build the query, SQL Server treats those 10,000 queries as separate, individual queries and builds an execution plan for each one.

MSDN: http://msdn.microsoft.com/en-us/library/yy6y35y8.aspx although you're going to want to simplify their code a bit and you'll have to reset the parameters on the command.

If you really, really want to get the data into the db fast, think about using bcp... but you'd better make sure the data is clean first (as there's no real error checking/handling on it).
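A minimal sketch of the parameterized approach described above, using the stp_FastInsertTest procedure from the question: the parameters are defined once, and only their values change inside the loop. The connection string and values are placeholders, and the 5000-row commit interval mirrors the question's code.

```csharp
// Sketch only: one cached plan, parameter values reset per iteration.
using System.Data;
using System.Data.SqlClient;

class ParameterizedInsertSketch
{
    static void Main()
    {
        using (var connection = new SqlConnection("..Connection String.."))
        {
            connection.Open();
            using (var command = new SqlCommand("stp_FastInsertTest", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                // Define the parameters once, outside the loop.
                command.Parameters.Add("@nF1", SqlDbType.Int);
                command.Parameters.Add("@nF2", SqlDbType.Int);
                command.Parameters.Add("@nF3", SqlDbType.Int);
                command.Parameters.Add("@sF4", SqlDbType.VarChar, 30);
                command.Parameters.Add("@sF5", SqlDbType.VarChar, 30);

                SqlTransaction transaction = connection.BeginTransaction();
                command.Transaction = transaction;
                for (int i = 1; i <= 10000; i++)
                {
                    command.Parameters["@nF1"].Value = i;
                    command.Parameters["@nF2"].Value = i;
                    command.Parameters["@nF3"].Value = i;
                    command.Parameters["@sF4"].Value = "vikram";
                    command.Parameters["@sF5"].Value = "varma";
                    command.ExecuteNonQuery();

                    if (i % 5000 == 0) // commit every 5000 rows
                    {
                        transaction.Commit();
                        transaction = connection.BeginTransaction();
                        command.Transaction = transaction;
                    }
                }
                transaction.Commit(); // commit any remaining rows
            }
        }
    }
}
```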

Upvotes: 0

Marc Gravell

Reputation: 1064204

If you are running something like:

exec StoredProcedureInsertOneRowInTable 'blah', ...
exec StoredProcedureInsertOneRowInTable 'bloop', ...
exec StoredProcedureInsertOneRowInTable 'more', ...

in SSMS, that is an entirely different scenario, where all of that is a single batch. With ADO.NET you are paying a round-trip per ExecuteNonQuery - I'm actually impressed it managed 1000/s.

Re the logical reads, that could just be looking at the query-plan cache, but without knowing more about StoredProcedureInsertOneRowInTable it is impossible to comment on whether something query-specific is afoot. But I suspect you have some different SET conditions between SSMS and ADO.NET that is forcing it to use a different plan - this is in particular a problem with things like persisted calculated indexed columns, and columns "promoted" out of a sql-xml field.

Re making it faster - in this case it sounds like table-valued parameters are exactly the thing, but you should also review the other options here
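The table-valued-parameter idea can be sketched as below. Note that neither the table type nor the procedure exists in the question; the names dbo.FastInsertRow and dbo.stp_FastInsertMany are purely illustrative, and the connection string is a placeholder.

```csharp
// Sketch only: server-side objects assumed to exist (illustrative names):
//
//   CREATE TYPE dbo.FastInsertRow AS TABLE
//       (F1 int, F2 int, F3 int, F4 varchar(30), F5 varchar(30));
//
//   CREATE PROC dbo.stp_FastInsertMany @rows dbo.FastInsertRow READONLY
//   AS
//   BEGIN
//       INSERT INTO tbl_FastInsertTest (F1, F2, F3, F4, F5)
//       SELECT F1, F2, F3, F4, F5 FROM @rows;
//   END
using System.Data;
using System.Data.SqlClient;

class TvpInsertSketch
{
    static void Main()
    {
        // Stage all rows client-side in a DataTable matching the table type.
        var rows = new DataTable();
        rows.Columns.Add("F1", typeof(int));
        rows.Columns.Add("F2", typeof(int));
        rows.Columns.Add("F3", typeof(int));
        rows.Columns.Add("F4", typeof(string));
        rows.Columns.Add("F5", typeof(string));
        for (int i = 0; i < 10000; i++)
            rows.Rows.Add(i, i + 1, i + 2, "vikram", "varma");

        using (var connection = new SqlConnection("..Connection String.."))
        {
            connection.Open();
            using (var command = new SqlCommand("dbo.stp_FastInsertMany", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                SqlParameter p = command.Parameters.AddWithValue("@rows", rows);
                p.SqlDbType = SqlDbType.Structured;
                p.TypeName = "dbo.FastInsertRow";
                command.ExecuteNonQuery(); // one round-trip for all 10000 rows
            }
        }
    }
}
```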

Upvotes: 2
