user2755680

Reputation: 95

Calling a SQL select statement from C# thousands of times is very time-consuming. Is there a better way?

I get a list of IDs and amounts from an Excel file (thousands of IDs and corresponding amounts). I then need to check the database to see whether each ID exists and, if it does, make sure the amount in the DB is greater than or equal to the amount from the Excel file.

The problem is that running this select statement upwards of 6,000 times and returning the values I need takes a long time. Even at half a second apiece it will take about an hour to do all the selects. (I normally don't get more than 5 results back.)

Is there a faster way to do this?

Is it possible to somehow pass all the IDs at once, make just one call, and get back the whole collection?

I have tried using SqlDataReaders and SqlDataAdapters, but they seem to be about the same (too long either way).

General idea of how this works below

for (int i = 0; i < ID.Count; i++)
{
    // One query (and one round trip) per ID -- this is the bottleneck.
    using (SqlCommand cmd = new SqlCommand(
        "select Amount, Client, Pallet from table where ID = @ID and Amount > 0;", sqlCon))
    {
        cmd.Parameters.Add("@ID", SqlDbType.VarChar).Value = ID[i];

        using (SqlDataAdapter da = new SqlDataAdapter(cmd))
        {
            da.Fill(dataTable);
        }
    }
}

Upvotes: 6

Views: 2471

Answers (12)

Dinesh Maind

Reputation: 349

There is another way to handle this: build an XML document of the IDs and pass it to a stored procedure. Here is the code for the procedure.

IF OBJECT_ID('GetDataFromDatabase') IS NOT NULL
BEGIN
    DROP PROCEDURE GetDataFromDatabase
END
GO

--Definition
CREATE PROCEDURE GetDataFromDatabase
    @xmlData XML
AS
BEGIN
    DECLARE @DocHandle INT
    DECLARE @idList TABLE (id INT)

    -- Parse the XML once and shred it into a table variable
    EXEC sp_xml_preparedocument @DocHandle OUTPUT, @xmlData;

    INSERT INTO @idList (id)
    SELECT x.id
    FROM OPENXML(@DocHandle, '//data', 2) WITH ([id] INT) x

    EXEC sp_xml_removedocument @DocHandle;
    --SELECT * FROM @idList

    -- One set-based join instead of one query per ID
    SELECT t.Amount, t.Client, t.Pallet
    FROM yourTable t
    INNER JOIN @idList x ON t.id = x.id AND t.Amount > 0;
END
GO

--Uses
EXEC GetDataFromDatabase @xmlData = '<root><data><id>1</id></data><data><id>2</id></data></root>'

You can put any logic you need in the procedure, and you can pass the amount alongside each ID in the same XML. The XML parameter comfortably carries a huge list of IDs.
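
On the C# side, a minimal sketch of the calling code might look like the following. It reuses sqlCon and dataTable from the question and assumes the IDs are in a list called ids; it needs the System.Data, System.Data.SqlClient, and System.Text namespaces.

var sb = new StringBuilder("<root>");
foreach (int id in ids)                       // ids: hypothetical list loaded from Excel
    sb.Append("<data><id>").Append(id).Append("</id></data>");
sb.Append("</root>");

using (var cmd = new SqlCommand("GetDataFromDatabase", sqlCon))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@xmlData", SqlDbType.Xml).Value = sb.ToString();
    using (var da = new SqlDataAdapter(cmd))
        da.Fill(dataTable);                   // one round trip for the whole ID list
}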

Upvotes: 1

Jon Raynor

Reputation: 3892

The IN clause has limits, so if you go with that approach, make sure you batch the IDs and process X of them at a time; otherwise you will hit another issue.

As @Robertharvey has noted, if there are not a lot of IDs and there are no transactions occurring, then just pull all the IDs at once into memory, into a dictionary-like object, and process them there (see the sketch below). Six thousand values is not a lot, and a single select could return all of them within a few seconds.

Just remember that if another process is updating the data, your local cached version may be stale.
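
A minimal sketch of that "load once, check in memory" idea, reusing the table and column names from the question (excelRows and its properties are hypothetical placeholders for the parsed spreadsheet):

// Pull every candidate row once, then answer each Excel row from memory.
var amounts = new Dictionary<string, decimal>();
using (var cmd = new SqlCommand(
    "select ID, Amount from table where Amount > 0;", sqlCon))
using (var reader = cmd.ExecuteReader())
{
    while (reader.Read())
        amounts[reader.GetString(0)] = reader.GetDecimal(1);  // assumes Amount is decimal
}

foreach (var row in excelRows)   // excelRows: hypothetical parsed Excel data
{
    decimal dbAmount;
    bool exists = amounts.TryGetValue(row.Id, out dbAmount);
    bool covered = exists && dbAmount >= row.Amount;
    // act on exists/covered here
}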

Upvotes: 1

Mladen Prajdic

Reputation: 15677

Use Table-Valued Parameters

With them you can pass a C# DataTable with your values into a stored procedure as a result set/table, which you can join to and do a simple:

SELECT *
FROM YourTable
WHERE NOT EXISTS (SELECT * FROM InputResultSet WHERE YourConditions)
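
On the C# side, a hedged sketch of passing the values as a TVP; it assumes a table type such as dbo.IdList has already been created on the server, and it is shown with an inline query, though the same parameter works with a stored procedure:

// Assumes: CREATE TYPE dbo.IdList AS TABLE (id VARCHAR(50) PRIMARY KEY);
var idTable = new DataTable();
idTable.Columns.Add("id", typeof(string));
foreach (string id in ID)                  // ID: the list from the question
    idTable.Rows.Add(id);

using (var cmd = new SqlCommand(
    "select t.Amount, t.Client, t.Pallet from table t " +
    "inner join @ids i on t.ID = i.id where t.Amount > 0;", sqlCon))
{
    var p = cmd.Parameters.AddWithValue("@ids", idTable);
    p.SqlDbType = SqlDbType.Structured;    // marks the parameter as a TVP
    p.TypeName = "dbo.IdList";
    using (var da = new SqlDataAdapter(cmd))
        da.Fill(dataTable);
}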

Upvotes: 2

Andrew

Reputation: 7778

You can select the whole result set (or join multiple 'limited' result sets) and save it all to a DataTable. Then you can do selects and updates (if needed) directly on the DataTable, and plug the new data back in afterwards. It is not super efficient memory-wise, but it is often a very good (and sometimes the only) solution when working in bulk and speed matters. With thousands of records it might take a couple of minutes to populate all of them into the DataTable.

then you can search your table like this:

string findMatch = "ID = '" + id + "'";   // DataTable filter expression; id is the value to look up
DataRow[] rowsFound = dataTable.Select(findMatch);

Then just loop over the results: foreach (DataRow dr in rowsFound)

Upvotes: 0

Anton

Reputation: 11

SqlDataAdapter objects are too heavy for this. First, using a stored procedure will be faster. Second, use a set-based operation: pass the whole list of identifiers to the database as one parameter, run a single query over them, and return the processed result. It will be quick and efficient, because all the data-processing logic stays on the database server.

Upvotes: 0

Jonny Cundall

Reputation: 2612

You can import the data from the Excel file into SQL Server as a table (using the Import Data wizard). Then you can perform a single query in SQL Server that joins this table to your lookup table on the ID field. There are a few more steps to this process, but it's a lot neater than trying to concatenate all the IDs into a much longer query.

I'm assuming a certain amount of access privileges to the server here, but this is what I'd do given the access I normally have. I'm also assuming this is a one-off task. If not, the import of the data to SQL Server can be done programmatically as well.
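
For the programmatic route, one hedged sketch using SqlBulkCopy; the staging table name and columns are assumptions:

// Bulk-copy the Excel rows into a staging table, then join it to the
// lookup table server-side. Assumes dbo.ExcelStaging(ID, Amount) exists.
var staging = new DataTable();
staging.Columns.Add("ID", typeof(string));
staging.Columns.Add("Amount", typeof(decimal));
// ... fill 'staging' with the rows parsed from the Excel file ...

using (var bulk = new SqlBulkCopy(sqlCon))
{
    bulk.DestinationTableName = "dbo.ExcelStaging";
    bulk.WriteToServer(staging);
}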

Upvotes: 1

AaronLS

Reputation: 38365

"I have tried using SqlDataReaders and SqlDataAdapters"

It sounds like you might be open to other APIs. Using Linq2SQL or Linq2Entities:

var someListIds = new List<int> { 1, 5, 6, 7 }; // imagine you load this from wherever
var matches = db.MyTable.Where(mt => someListIds.Contains(mt.ID));

This is safe in terms of avoiding potential SQL injection vulnerabilities and will generate an IN clause. Note, however, that someListIds can be so large that the generated SQL exceeds query-length limits; the same is true of any other technique involving the IN clause. You can easily work around that by partitioning the list into large chunks (see the sketch below), and you will still be tremendously better off than with a query per ID.
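
A hedged sketch of that chunking; the batch size of 2,000 is an arbitrary choice, not a documented limit:

const int batchSize = 2000;
var allMatches = new List<MyTable>();            // MyTable: the mapped entity type
for (int i = 0; i < someListIds.Count; i += batchSize)
{
    var chunk = someListIds.Skip(i).Take(batchSize).ToList();
    allMatches.AddRange(db.MyTable.Where(mt => chunk.Contains(mt.ID)));
}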

Upvotes: 2

Martin Smith

Reputation: 453328

Instead of a long IN list (difficult to parameterise, and the source of a number of other inefficiencies regarding execution plans: compilation time, plan reuse, and the quality of the plans themselves), you can pass all the values in at once via a table-valued parameter.

See arrays and lists in SQL Server for more details.

Generally I make sure to give the table type a primary key and use OPTION (RECOMPILE) to get the most appropriate execution plans (a sketch follows below).
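
A minimal SQL sketch of that setup; the type, procedure, and table names are assumptions:

-- Table type with a primary key, as suggested above.
CREATE TYPE dbo.IdList AS TABLE (id VARCHAR(50) PRIMARY KEY);
GO

CREATE PROCEDURE GetAmounts
    @ids dbo.IdList READONLY   -- table-valued parameters must be READONLY
AS
BEGIN
    SELECT t.Amount, t.Client, t.Pallet
    FROM yourTable t
    INNER JOIN @ids i ON t.ID = i.id
    WHERE t.Amount > 0
    OPTION (RECOMPILE);        -- compile with the actual TVP row count in view
END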

Upvotes: 13

sam yi

Reputation: 4934

Import the data directly into SQL Server. Use a stored procedure to output the data you need.

If you must consume it in the app tier... use the xml data type to pass it into a stored procedure.

Upvotes: 1

Servy

Reputation: 203834

Rather than performing a separate query for every single ID that you have, execute one query to get the amount for every ID that you want to check (or, if you have too many IDs to put in one query, batch them into groups of a few thousand).

Upvotes: 1

BCartolo

Reputation: 719

Use the IN operator. Your problem is very common, and it has a name: the N+1 performance problem.

Where are you getting the IDs from? If it is from another query, then consider grouping them into one.

Upvotes: 1

Niels Keurentjes

Reputation: 41958

Combine all the IDs together into a single large IN clause, so it reads like:

select Amount, Client, Pallet from table where ID in (1,3,5,7,9,11) and Amount > 0;
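
If you build that query from C#, a hedged sketch that keeps it parameterised rather than concatenating raw values (SQL Server caps a command at roughly 2,100 parameters, so very large lists still need batching):

// Build "ID in (@id0, @id1, ...)" with one parameter per value.
var names = new List<string>();
var cmd = new SqlCommand { Connection = sqlCon };
for (int i = 0; i < ID.Count; i++)
{
    string name = "@id" + i;
    names.Add(name);
    cmd.Parameters.Add(name, SqlDbType.VarChar).Value = ID[i];
}
cmd.CommandText = "select Amount, Client, Pallet from table where ID in ("
                  + string.Join(",", names) + ") and Amount > 0;";
using (var da = new SqlDataAdapter(cmd))
    da.Fill(dataTable);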

Upvotes: 5
