ixashish

Reputation: 13

update query takes too much time

I have a dataset with 10 million rows, and I am looping through it, using the dataset's values to update a base table with 25 million rows.

The field used in the WHERE clause is the primary key.

Still, performance is poor: only 100k records get updated in 1 hour.

How do I optimize and speed up the query execution in Oracle?

I am using a Command object and Connection object in VB.NET.

The following function is executed 10 million times to run the update query.

Public cmd_obj As New OleDb.OleDbCommand()

' field2 is the primary key of the base table
Strquery = "UPDATE Table1 SET field1 = 'Value from dataset' WHERE field2 = 'value from dataset'"

Public Function ExecuteNonQuery(ByVal Strquery As String) As Int32
  Dim intUpdated As Int32 = 0
  Try
    If VerifyOracleLogin() Then ' check the Oracle connection
      cmd_obj.Connection = dbconn
      cmd_obj.CommandText = Strquery ' Strquery holds the update statement
      intUpdated = cmd_obj.ExecuteNonQuery()
      Return intUpdated
    End If
    Return 0 ' connection check failed
  Catch ex As Exception
    WriteToErrorLog("Query Execution Error : " & Strquery, ex.Message)
    Return -1
  End Try
End Function
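
The calling loop, in sketch form (dataset and column names here are illustrative, not from the original code), amounts to one round trip per row, 10 million times:

```vb
' Sketch of the calling pattern described above: each dataset row
' produces its own single-row UPDATE statement and its own round trip.
For Each row As DataRow In ds.Tables(0).Rows
    ExecuteNonQuery("UPDATE Table1 SET field1 = '" & row("field1").ToString() & _
                    "' WHERE field2 = '" & row("field2").ToString() & "'")
Next
```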

Upvotes: 0

Views: 1109

Answers (3)

Joel Coehoorn

Reputation: 416149

Looking at your query:

"update Table1 set field1='Value from dataset' where field2='value from dataset'"

I first wondered how you're substituting in those database values... If I had to guess, I'd bet you're doing it in a way that's vulnerable to SQL injection. But then I realized it didn't matter, because it's likely that you have no business working with a dataset here at all. I bet you're going through every row in the dataset one by one and running essentially the same update query, and if that's true, then we can almost certainly write a query for you that will go out to the database and update every record you want in one statement, such that you never pull the dataset back to your machine at all. Such a query would run in a very small fraction of the time you need right now... but to do that, we need to see a bit more code first.
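
For illustration only: if the dataset's values themselves originate from a database table (here hypothetically called `Source1`; the original post doesn't say where they come from), the entire 10-million-iteration loop could collapse to a single statement along these lines:

```sql
-- Hypothetical set-based rewrite: one statement, zero client round trips.
-- Source1 stands in for wherever the new values actually live.
MERGE INTO Table1 t
USING Source1 s
   ON (t.field2 = s.field2)       -- field2 is Table1's primary key
 WHEN MATCHED THEN
      UPDATE SET t.field1 = s.field1;
```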

Upvotes: 1

Max

Reputation: 4077

There are multiple ways to handle this situation:

  1. Create a stored procedure and pass the list of primary keys and the values to be set as arrays. In the stored procedure, loop through the arrays and execute the update statement with exception handling, so that even if one record fails, the rest of the update statements can still be executed.
  2. Another way is to execute the scripts as 'NeverHopeless' suggested (appending the queries), but also add an error-logging statement after each update. Helpful documentation: http://www.oracle-base.com/articles/10g/dml-error-logging-10gr2.php

The bottleneck here might be the time taken for the .NET code to talk to Oracle on each call, so it is advisable to send as much data as possible in one go rather than sending it row by row.
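
Option 1 might look roughly like the following sketch (the procedure name and the use of the built-in `sys.odcivarchar2list` collection type are my assumptions, not part of the answer):

```sql
-- Hypothetical bulk procedure: keys and new values arrive as arrays and
-- are applied in one bulk-bound statement; failing rows are collected in
-- SQL%BULK_EXCEPTIONS instead of aborting the whole batch.
CREATE OR REPLACE PROCEDURE bulk_update_table1 (
    p_keys   IN sys.odcivarchar2list,
    p_values IN sys.odcivarchar2list
) AS
    bulk_errors EXCEPTION;
    PRAGMA EXCEPTION_INIT(bulk_errors, -24381);
BEGIN
    FORALL i IN 1 .. p_keys.COUNT SAVE EXCEPTIONS
        UPDATE Table1
           SET field1 = p_values(i)
         WHERE field2 = p_keys(i);
EXCEPTION
    WHEN bulk_errors THEN
        FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
            -- log SQL%BULK_EXCEPTIONS(j).ERROR_INDEX / .ERROR_CODE here
            NULL;
        END LOOP;
END bulk_update_table1;
/
```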

Upvotes: 0

NeverHopeless

Reputation: 11233

As you have mentioned:

I have a dataset with 1 crore (10 million) rows and I am looping through it

I would recommend trying a bulk insert using OpenXML. Have a look at this sample.

EDIT:

See this one also; there, SQL Server's OpenXml approach is translated to Oracle using openedXml.

Also, if you are using a DataSet, why not use the DataAdapter.Update method?
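
A minimal sketch of that route, assuming the DataSet was filled from Table1 (the `dbconn`, `ds`, and parameter names are illustrative):

```vb
' Sketch: let a data adapter push the DataSet's changed rows back through
' one parameterized UpdateCommand instead of concatenated SQL strings.
Dim adapter As New OleDb.OleDbDataAdapter()
Dim upd As New OleDb.OleDbCommand(
    "UPDATE Table1 SET field1 = ? WHERE field2 = ?", dbconn)
upd.Parameters.Add("p_field1", OleDb.OleDbType.VarChar, 200, "field1")
upd.Parameters.Add("p_field2", OleDb.OleDbType.VarChar, 200, "field2")
adapter.UpdateCommand = upd
adapter.Update(ds.Tables(0)) ' sends only rows whose RowState is Modified
```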

Upvotes: 2
