Reputation: 569
I have a VB.NET application that updates a table in a SQL Server database very frequently. The table has 143 columns and about 10,000 rows. The same procedure is required to update the table for several different modules, so the data updated changes all the time: sometimes it is just a few cells in a few rows, other times it may be several hundred rows and several columns.
At times it takes 15 to 30 seconds to update the information. That seems really long given that the table can be completely rewritten with a bulk import in a second or two (I realise that is beside the point). The database is set to simple recovery and the table has only one index. I have tried playing around with the update batch size with no noticeable improvement.
I'm using the code below to do the update. Is there anything I can do to improve the speed?
Dim oMainQueryR As String

If DBSelectionsDS.HasChanges Then
    Try
        oMainQueryR = "SELECT * FROM DBSelections"
        Using connection As New SqlConnection(RacingConStr)
            Using oDataSQL As New SqlDataAdapter(oMainQueryR, connection)
                oDataSQL.UpdateBatchSize = 100
                Using cbT As SqlCommandBuilder = New SqlCommandBuilder(oDataSQL)
                    connection.Open()
                    oDataSQL.Update(DBSelectionsDS, "DBSelectionsDetails")
                    connection.Close()
                End Using
            End Using
        End Using
        DBSelectionsDS.Tables("DBSelectionsDetails").AcceptChanges()
    Catch ex As Exception
        ErrMess = "ERROR - occurred " & ex.ToString
        Call WriteError()
        Call ViewError()
    End Try
End If
Upvotes: 0
Views: 1749
Reputation: 3855
I would be willing to bet the bottleneck lies in two places:
First, you are selecting everything from your table every time you need to run the update. This will take longer and longer as your table grows. The SqlCommandBuilder only needs a schema to work with, so change your query string to this:

oMainQueryR = "SELECT * FROM DBSelections WHERE 0 = 1"

This returns only the schema and column names for the table, but no rows; your DataSet already contains all the data the CommandBuilder needs to perform the update. If you are curious why this works: 0 never equals 1, so SQL Server finds no rows where 0 = 1 and returns just the schema.
Second, the UpdateBatchSize is limiting the batch size of the update. Consider having 200 rows of changed data in your DataSet with UpdateBatchSize = 100: the adapter will have to make two trips to the database to finish the update. Setting UpdateBatchSize = 0 removes this limit and lets the adapter use the largest batch size the server can handle. (Note that simply removing the line will not help: the default value of UpdateBatchSize is 1, which disables batching entirely.)
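Putting both changes together, the update routine from the question could look like the sketch below. It assumes the same RacingConStr connection string, DBSelectionsDS dataset, and error helpers as the original code:

```vbnet
' Sketch of the revised routine, assuming the same RacingConStr,
' DBSelectionsDS, ErrMess, WriteError and ViewError as the original code.
If DBSelectionsDS.HasChanges() Then
    Try
        ' WHERE 0 = 1 returns no rows -- the adapter only needs the schema.
        Dim oMainQueryR As String = "SELECT * FROM DBSelections WHERE 0 = 1"
        Using connection As New SqlConnection(RacingConStr)
            Using oDataSQL As New SqlDataAdapter(oMainQueryR, connection)
                ' 0 = let the adapter use the largest batch size the server supports.
                oDataSQL.UpdateBatchSize = 0
                Using cbT As New SqlCommandBuilder(oDataSQL)
                    connection.Open()
                    oDataSQL.Update(DBSelectionsDS, "DBSelectionsDetails")
                End Using
            End Using
        End Using
        DBSelectionsDS.Tables("DBSelectionsDetails").AcceptChanges()
    Catch ex As Exception
        ErrMess = "ERROR - occurred " & ex.ToString()
        Call WriteError()
        Call ViewError()
    End Try
End If
```

The explicit connection.Close() is no longer needed; the Using block disposes the connection either way.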
Otherwise your bottleneck could be caused by another transaction locking the DBSelections table. If you are running any queries against that table while the update is happening, either use the WITH (NOLOCK) table hint on those reads, or ensure that your update is the only transaction occurring at the time.
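For example, a concurrent read against the table could apply the hint like this (a sketch; the table name and connection string are from the question, the rest is illustrative):

```vbnet
' Illustrative sketch: reading DBSelections without taking shared locks,
' so a concurrent update is not blocked. Be aware that NOLOCK permits
' dirty reads -- you may see uncommitted data.
Dim readQuery As String = "SELECT * FROM DBSelections WITH (NOLOCK)"
Using connection As New SqlConnection(RacingConStr)
    Using adapter As New SqlDataAdapter(readQuery, connection)
        Dim snapshot As New DataTable()
        adapter.Fill(snapshot)
    End Using
End Using
```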
Upvotes: 3