TJF

Reputation: 2258

Improve .NET/MSSQL select & update performance

I'd like to increase performance of very simple select and update queries of .NET & MSSQL 2k8. My queries always select or update a single row. The DB tables have indexes on the columns I query on. My test .NET code looks like this:

    public static MyData GetMyData(int accountID, string symbol)
    {
        using (var cnn = new SqlConnection(connectionString))
        {
            cnn.Open();

            using (var cmd = new SqlCommand("MyData_Get", cnn))
            {
                cmd.CommandType = CommandType.StoredProcedure;

                cmd.Parameters.Add(CreateInputParam("@AccountID", SqlDbType.Int, accountID));
                cmd.Parameters.Add(CreateInputParam("@Symbol", SqlDbType.VarChar, symbol));

                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        var myData = new MyData();
                        myData.ID = (int)reader["ID"];
                        myData.A = (int)reader["A"];
                        myData.B = reader["B"].ToString();
                        myData.C = (int)reader["C"];
                        myData.D = Convert.ToDouble(reader["D"]);
                        myData.E = Convert.ToDouble(reader["E"]);
                        myData.F = Convert.ToDouble(reader["F"]);

                        return myData;
                    }
                }
            }
        }

        return null; // no matching row
    }

and the corresponding stored procedure looks like this:

    PROCEDURE [dbo].[MyData_Get]
        @AccountID int,
        @Symbol varchar(25)
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT p.ID, p.A, p.B, p.C, p.D, p.E, p.F
        FROM [MyData] AS p
        WHERE p.AccountID = @AccountID AND p.Symbol = @Symbol
    END

What I'm seeing is that if I run GetMyData in a loop, querying MyData objects, I don't exceed about ~310 transactions/sec. I was hoping to achieve well over 1000 transactions/sec.

On the SQL Server side, I'm not really sure what I can improve for such a simple query. ANTS profiler shows me that on the .NET side, as expected, the bottlenecks are cnn.Open and cmd.ExecuteReader; however, I have no idea how I could significantly improve my .NET code.

I've seen benchmarks, though, where people seem to easily achieve tens of thousands of transactions/sec.

Any advice on how I can significantly improve the performance for this scenario would be greatly appreciated!

Thanks,

Tom

EDIT:

Per MrLink's recommendation, adding "TOP 1" to the SELECT query improved performance from about 310 to about 585 transactions/sec.

EDIT 2:

Arash N suggested running the SELECT query "WITH(NOLOCK)", and that dramatically improved the performance! I'm now seeing around 2500 transactions/sec.

EDIT 3:

Another slight optimization on the .NET side gained me another ~150 transactions/sec: changing while(reader.Read()) to if(reader.Read()) surprisingly made quite a difference. On average I'm now seeing 2719 transactions/sec.

Upvotes: 2

Views: 2588

Answers (9)

Justin

Reputation: 86729

I was hoping to achieve well over a 1000 transactions/sec [when running GetMyData in a loop]

What you are asking for is for GetMyData to run in less than 1 ms - this is just pointless optimisation! At the bare minimum this method involves a round trip to the database server (possibly involving network access) - you wouldn't be able to make this method much faster even if your query were SELECT 1.

If you have a genuine requirement to make more requests per second then the answer is either to use multiple threads or to buy a faster PC.

There is absolutely nothing wrong with your code - I'm not sure where you have seen people managing 10,000+ transactions per second, but I'm sure it must have involved multiple concurrent clients accessing the same database server, rather than a single thread managing to execute each query in less than a tenth of a millisecond!
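To illustrate the multi-thread route, here is a minimal sketch (assuming .NET 4's Parallel.For is available): each call opens its own pooled connection, so the round-trip latencies overlap instead of serializing on one thread. The iteration count, degree of parallelism, and the accountID/symbol values are placeholders.

```csharp
using System.Threading.Tasks;

// Hypothetical throughput test: fan the same lookup out over several
// threads; the connection pool hands each worker its own connection.
Parallel.For(0, 10000,
    new ParallelOptions { MaxDegreeOfParallelism = 8 },
    i =>
    {
        var data = GetMyData(accountID, symbol); // each call uses a pooled connection
    });
```

Aggregate throughput then scales with the number of in-flight requests rather than with single-request latency.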

Upvotes: 1

Arash N

Reputation: 324

Try using WITH (NOLOCK) in your SELECT statement to increase the performance. This selects the row without taking a shared lock (note it is a dirty read, so uncommitted data may be returned). The hint goes after the table alias:

    SELECT p.ID, p.A, p.B, p.C, p.D, p.E, p.F FROM [MyData] AS p WITH (NOLOCK) WHERE p.AccountID = @AccountID AND p.Symbol = @Symbol

Upvotes: 1

Nicholas Carey

Reputation: 74267

In no particular order...

  • Have you (or your DBAs) examined the execution plan your stored procedure is getting? Has SQL Server cached a bogus execution plan (either due to oddball parameters or old stats)?

  • How often are statistics updated on the database?

  • Do you use temp tables in your stored procedure? If so, are they created up front? If not, you'll be doing a lot of recompiles, since creating a temp table invalidates the execution plan.

  • Are you using connection pooling? Opening/closing a SQL Server connection is an expensive operation.

  • Is your table clustered on AccountID and Symbol?

Finally...

  • Is there a reason you're hitting this table by account and symbol rather than, say, just retrieving all the data for an entire account in one fell swoop?

Upvotes: 0

quentin-starin

Reputation: 26638

Try re-using the SqlCommand and calling Prepare() on it first.

I can't say that it will definitely help, but it seems worth a try.
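A minimal sketch of that idea, assuming the lookup is issued as a plain text command (Prepare() is essentially a no-op for CommandType.StoredProcedure) and a hypothetical requests list. Prepare() requires every variable-length parameter to have an explicit size:

```csharp
// Sketch: one connection and one prepared command reused for every lookup.
using (var cnn = new SqlConnection(connectionString))
{
    cnn.Open();
    using (var cmd = new SqlCommand(
        "SELECT TOP 1 ID, A, B, C, D, E, F FROM MyData " +
        "WHERE AccountID = @AccountID AND Symbol = @Symbol", cnn))
    {
        cmd.Parameters.Add("@AccountID", SqlDbType.Int);
        cmd.Parameters.Add("@Symbol", SqlDbType.VarChar, 25); // size required for Prepare
        cmd.Prepare();

        foreach (var req in requests) // hypothetical work list
        {
            cmd.Parameters["@AccountID"].Value = req.AccountID;
            cmd.Parameters["@Symbol"].Value = req.Symbol;
            using (var reader = cmd.ExecuteReader())
            {
                // materialize MyData as in GetMyData
            }
        }
    }
}
```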

Upvotes: 0

quentin-starin

Reputation: 26638

Make sure your database connections are actually being pooled. If you are seeing a bottleneck in cnn.Open, there is a good chance they are not.
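For reference, pooling is on by default in SqlClient, but only calls using a byte-for-byte identical connection string share a pool. A sketch making the pool settings explicit (Pooling, Min Pool Size, and Max Pool Size are real connection-string keywords; the server and database names are placeholders):

```csharp
// An exact-match connection string is required for calls to share a pool.
var connectionString =
    "Data Source=myServer;Initial Catalog=MyDb;Integrated Security=SSPI;" +
    "Pooling=true;Min Pool Size=5;Max Pool Size=100;";
```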

Upvotes: 1

MrLink

Reputation: 121

Some things to consider.

First, you're not closing the server connection (cnn.Close();). Eventually it will get closed by the garbage collector, but until that happens you're creating a brand-new connection to the database every time rather than getting one from the connection pool.

Second, do you have an index in SQL Server covering the AccountID and Symbol columns?

Third, while AccountID being an int is nice and fast, the Symbol column being varchar(25) is always going to be slower. Can you change it to an int flag?
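The index from the second point could be made covering, so the lookup never touches the base table. A sketch, assuming the table and columns from the question (the index name is arbitrary):

```sql
-- Hypothetical covering index for the lookup in MyData_Get:
-- seeks on (AccountID, Symbol) and includes the selected columns.
CREATE NONCLUSTERED INDEX IX_MyData_AccountID_Symbol
    ON dbo.MyData (AccountID, Symbol)
    INCLUDE (ID, A, B, C, D, E, F);
```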

Upvotes: 1

Jason Goemaat

Reputation: 29214

You could use output parameters instead of a SELECT, since there's always a single row.

You could also create the SqlCommand ahead of time and re-use it. If you are executing a lot of queries in a single thread, you can keep executing it on the same connection. If not, you could create a pool of them, or call cmdTemplate.Clone() and set the Connection.
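A rough sketch of the output-parameter approach, assuming a hypothetical variant of the procedure (MyData_GetOut) that declares each column as OUTPUT. This avoids the result set and reader entirely:

```csharp
// Sketch: return the single row through OUTPUT parameters.
// MyData_GetOut is a hypothetical procedure name.
using (var cmd = new SqlCommand("MyData_GetOut", cnn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@AccountID", SqlDbType.Int).Value = accountID;
    cmd.Parameters.Add("@Symbol", SqlDbType.VarChar, 25).Value = symbol;
    cmd.Parameters.Add("@ID", SqlDbType.Int).Direction = ParameterDirection.Output;
    cmd.Parameters.Add("@A", SqlDbType.Int).Direction = ParameterDirection.Output;
    // ... one OUTPUT parameter per remaining column ...

    cmd.ExecuteNonQuery();

    var myData = new MyData
    {
        ID = (int)cmd.Parameters["@ID"].Value,
        A  = (int)cmd.Parameters["@A"].Value
        // ... read the remaining OUTPUT parameters ...
    };
}
```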

Upvotes: 0

Robin Orheden

Reputation: 2764

If the data is not frequently invalidated (updated), I would implement a caching layer. This is one of the most effective ways (when used correctly) to gain performance.
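A minimal sketch of such a layer in front of GetMyData, assuming staleness is acceptable and (accountID, symbol) identifies the row. A dictionary plus a lock keeps it simple; a real cache would add expiry and invalidation:

```csharp
using System.Collections.Generic;

static readonly Dictionary<string, MyData> cache = new Dictionary<string, MyData>();
static readonly object cacheLock = new object();

public static MyData GetMyDataCached(int accountID, string symbol)
{
    string key = accountID + "|" + symbol;
    lock (cacheLock)
    {
        MyData hit;
        if (cache.TryGetValue(key, out hit))
            return hit; // served from memory, no round trip
    }

    MyData fresh = GetMyData(accountID, symbol); // falls through to the DB
    lock (cacheLock)
    {
        cache[key] = fresh;
    }
    return fresh;
}
```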

Upvotes: 0

Boris Callens

Reputation: 93327

Is your method called frequently? Could you batch your requests so you open your connection, create your parameters, get the results, and reuse them for several queries before closing the whole thing up again?
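A sketch of that batching idea, assuming a hypothetical requests list: the connection is opened once and reused for every lookup, so the open/close cost is paid once per batch instead of once per query.

```csharp
using (var cnn = new SqlConnection(connectionString))
{
    cnn.Open(); // one open/close for the whole batch
    foreach (var req in requests) // hypothetical work list
    {
        using (var cmd = new SqlCommand("MyData_Get", cnn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add(CreateInputParam("@AccountID", SqlDbType.Int, req.AccountID));
            cmd.Parameters.Add(CreateInputParam("@Symbol", SqlDbType.VarChar, req.Symbol));
            using (var reader = cmd.ExecuteReader())
            {
                // materialize MyData as in GetMyData
            }
        }
    }
}
```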

Upvotes: 0
