Reputation: 473
Edit:
My problem is not a problem anymore: I redid my performance tests and found that I had made a fatal, stupid error: I had forgotten a factor of 1,000 when converting milliseconds to seconds. :/
Sorry for that, guys.
For info:
- I get about 1,900 updates per second from my PC to the database server over the local network.
- 3,200 updates per second if the program runs on the same machine as the DB.
- 3,500 updates per second from my PC to the database server if I do not re-create and re-open a new SqlConnection each time.
- 5,800 updates per second with a batched command text.
For my 10,000 rows, if it takes 5 seconds, that is fine for my program. Sorry to have worried you.
Currently, I use a SQL stored procedure to create a row in my database, to avoid SQL injection. In C# I have the following method:
public void InsertUser(string userCode)
{
    using (SqlConnection sqlConnection = new SqlConnection(this.connectionString))
    {
        SqlCommand sqlCommand = new SqlCommand("InsertUser", sqlConnection);
        sqlCommand.CommandType = System.Data.CommandType.StoredProcedure;
        sqlCommand.Parameters.Add(new SqlParameter("@UserCode", userCode));
        sqlConnection.Open();
        sqlCommand.ExecuteNonQuery(); // ~0.2 ms here (not 0.2 s -- that was my x1000 error)
    }
}
It works great when I have one or two rows to insert. But if I need to create 1,000 users, 10,000 products and 5,000 pets, it is not the best solution: I will lose a huge amount of time in network transport.
I believe, without having checked it, that I can only use a limited number of callbacks, so I do not want to call the following 10,000 times:
sqlCommand.BeginExecuteNonQuery()
Another way would be to build a batch of command text, but that carries a SQL-injection risk (and it is ugly).
Is there a 'SqlCommandList' object in .NET that manages this? How do I do large writes to the database? What is the right pattern for that?
Upvotes: 2
Views: 18888
Reputation: 415820
This should run a little faster:
public void InsertUser(IEnumerable<string> userCodes)
{
    using (SqlConnection sqlConnection = new SqlConnection(this.connectionString))
    using (SqlCommand sqlCommand = new SqlCommand("InsertUser", sqlConnection))
    {
        sqlCommand.CommandType = System.Data.CommandType.StoredProcedure;
        SqlParameter param = sqlCommand.Parameters.Add("@UserCode", SqlDbType.VarChar);
        sqlConnection.Open();
        foreach (string code in userCodes)
        {
            param.Value = code;
            sqlCommand.ExecuteNonQuery(); // ~0.2 ms per call
        }
    }
}
That will only open one connection and only create one command, even if you pass it 1,000 users. It will still do each insert separately, though. And of course if userCode isn't a string you'll want to refactor it appropriately. You may also want to look into SQL Server's BULK INSERT command.
Upvotes: 10
Reputation: 11
I'm guessing this is a pretty old question.
With SQL Server 2008, the answer now is to use a table-valued parameter. In short, pass in all your rows in a user-defined type (a table type).
In SQL, you can then process all of the records as individual items: actually use set logic and get real performance.
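As a hedged sketch, the client side of that might look like the following. The table type dbo.UserCodeList and procedure dbo.InsertUsers are hypothetical names, not from the original question; the procedure is assumed to declare `@UserCodes dbo.UserCodeList READONLY` and do a single set-based `INSERT ... SELECT` from it.

```csharp
// Sketch only: pass all rows in one call via a table-valued parameter
// (hypothetical type dbo.UserCodeList with one varchar column "UserCode").
var table = new DataTable();
table.Columns.Add("UserCode", typeof(string));
foreach (string code in userCodes)
    table.Rows.Add(code);

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("dbo.InsertUsers", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    SqlParameter param = command.Parameters.AddWithValue("@UserCodes", table);
    param.SqlDbType = SqlDbType.Structured;   // marks it as a TVP
    param.TypeName = "dbo.UserCodeList";      // the server-side table type
    connection.Open();
    command.ExecuteNonQuery();                // one round trip for all rows
}
```

The whole batch travels to the server in a single call, and the procedure can insert it with one set-based statement instead of 10,000 row-by-row calls.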
Upvotes: 1
Reputation: 432271
What about the UpdateBatchSize property of the SqlDataAdapter?
Our front-end guys use this to batch a few 10,000 proc calls into chunks.
Our environment disallows "bulkadmin" rights, so we can't use BULK INSERT/bcp, etc.
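A rough sketch of how that batching looks, assuming the same InsertUser procedure from the question and a hypothetical userCodes collection. Note that UpdatedRowSource must be set to None for ADO.NET to batch the commands.

```csharp
// Sketch only: SqlDataAdapter sends the inserts in batches instead of
// one round trip per row.
using (var connection = new SqlConnection(connectionString))
using (var adapter = new SqlDataAdapter())
{
    adapter.InsertCommand = new SqlCommand("InsertUser", connection)
    {
        CommandType = CommandType.StoredProcedure,
        UpdatedRowSource = UpdateRowSource.None // required for batching
    };
    // Map the @UserCode parameter to the "UserCode" source column.
    adapter.InsertCommand.Parameters.Add("@UserCode", SqlDbType.VarChar, 50, "UserCode");
    adapter.UpdateBatchSize = 100; // 0 = send everything in one batch

    var table = new DataTable();
    table.Columns.Add("UserCode", typeof(string));
    foreach (string code in userCodes)
        table.Rows.Add(code); // new rows are in the Added state

    adapter.Update(table); // issues the proc calls in chunks of 100
}
```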
Upvotes: 4
Reputation:
As per some of the answers above, the most noticeable performance increase for the least effort involves two changes to your existing code.
BULK INSERTs are an option, but probably overkill for what you want to do.
Upvotes: 0
Reputation: 131112
Based on Joel's answer, this is the fastest solution short of using either SqlBulkCopy or building big strings of messy SQL and executing them. (I added a transaction, which improves performance quite a lot.)
public void InsertUser(IEnumerable<string> userCodes)
{
    using (SqlConnection sqlConnection = new SqlConnection(this.connectionString))
    {
        sqlConnection.Open();
        SqlTransaction transaction = sqlConnection.BeginTransaction();
        SqlCommand sqlCommand = new SqlCommand("InsertUser", sqlConnection);
        sqlCommand.Transaction = transaction;
        sqlCommand.CommandType = System.Data.CommandType.StoredProcedure;
        SqlParameter param = sqlCommand.Parameters.Add("@UserCode", SqlDbType.VarChar);
        foreach (string code in userCodes)
        {
            param.Value = code;
            sqlCommand.ExecuteNonQuery();
        }
        transaction.Commit();
    }
}
Upvotes: 2
Reputation: 97691
If you really were concerned about this, you could (like you said) batch the commands up in strings like so:
var cmd = new SqlCommand();
cmd.Connection = sqlConnection;
for (int i = 0; i < batchSize; i++)
{
    cmd.CommandText += String.Format("EXEC InsertUser @UserCode{0};", i);
    cmd.Parameters.AddWithValue("@UserCode" + i.ToString(), XXXXX);
    // ... etc ...
}
Because in this scheme you'd still be using parameters, you have no more risk of SQL injection than if you used a stored proc. But I question whether you'll really save an appreciable amount of time doing this. IMO you should just keep it simple and do it the way you are doing it now.
Upvotes: 2
Reputation: 51711
"it is not the best solution: I will loose a huge time in netwok transport" Can you live with the loss?
If this is something you don't do often, then does it matter? Measure it first, if it's a problem then fix it, personally probably I'd go with Marc Gravells table for incoming inserts. Another option is to fire the inserts asynchronously, then you're not waiting on each to finish before you start the next.
It took me years, but finally I figured out that I shouldn't waste time optimising code that doesn't need it.
Hope this helps (even though I don't think it will, sorry).
Upvotes: 0
Reputation: 1062855
Personally, if I regularly expect to do fairly large inserts (10,000 rows would definitely qualify...), I might consider having a separate table for incoming data, and use SqlBulkCopy to populate this table. Then you just execute a single stored procedure that moves the data over into the real table.
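A minimal sketch of that staging-table pattern; the staging table dbo.UserStaging and the procedure dbo.MoveStagedUsers are hypothetical names for illustration.

```csharp
// Sketch only: bulk-copy into a staging table, then one proc call moves
// the rows into the real table server-side.
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();

    var table = new DataTable();
    table.Columns.Add("UserCode", typeof(string));
    foreach (string code in userCodes)
        table.Rows.Add(code);

    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "dbo.UserStaging"; // hypothetical
        bulkCopy.WriteToServer(table); // streamed, not row-by-row commands
    }

    // Single round trip to move the staged rows into the real table.
    using (var move = new SqlCommand("dbo.MoveStagedUsers", connection))
    {
        move.CommandType = CommandType.StoredProcedure;
        move.ExecuteNonQuery();
    }
}
```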
Another approach is to send XML down to the database and use SQLXML to parse it (much easier with SQL 2005 and above), but this puts extra work on the db server.
Upvotes: 2
Reputation: 7710
Have you considered passing an XML document to a stored procedure, then iterating through that to find the data to insert?
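As a sketch of what that could look like from the C# side (the procedure name and XML shape are assumptions; the procedure would shred the document server-side, e.g. with OPENXML or the xml type's nodes() method):

```csharp
// Sketch only: send all rows as one XML document to a hypothetical
// procedure dbo.InsertUsersFromXml with an xml-typed @Users parameter.
// Requires System.Xml.Linq (.NET 3.5+).
var xml = new XElement("users",
    userCodes.Select(c => new XElement("user", new XAttribute("code", c))));

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("dbo.InsertUsersFromXml", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.Add("@Users", SqlDbType.Xml).Value = xml.ToString();
    connection.Open();
    command.ExecuteNonQuery(); // one round trip for the whole batch
}
```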
Upvotes: 0