Belliez

Reputation: 5376

Optimise and speed up very slow Linq / SQL code

I have the following C# method which I use to pre-load a data table of stock. Although it works very well, now that I have many rows in the table it can be very slow to load.

Please can someone recommend a better and much faster way to do this? (Ideally to remove the "foreach" code as this is the slow bit!).

public static DataTable GetProducts()
{
    DataTable table = new DataTable();

    using (DataClassesDataContext data = new DataClassesDataContext(cDbConnection.GetConnectionString()))
    {
        var query = (from p in data.Products
                     where p.Deleted == false
                     join s in data.ProductStocks on p.ProductID equals s.ProductID
                     group s by p into g
                     select new { g });


        table.Columns.Add("Barcode", typeof(string));
        table.Columns.Add("Stock Code", typeof(string));
        table.Columns.Add("Description", typeof(string));
        table.Columns.Add("Price", typeof(string));
        table.Columns.Add("Tax", typeof(string));
        table.Columns.Add("Stock", typeof(string));
        table.Columns.Add("Service Item", typeof(bool));
        table.Columns.Add("Deduct Stock", typeof(bool));

        if (query != null)
        {
            foreach (var item in query)
            {
                try
                {
                    decimal? Tax = 0;
                    if (item.g.Key.ProductTax != null)
                    {
                        Tax = Common.Utilities.IsValueValidDecimal(item.g.Key.ProductTax.TaxRate, 0);   // Tax
                    }
                    else
                    {
                        Tax = 0;
                    }

                    bool DeductStock = !Convert.ToBoolean(item.g.Key.ServiceItem);

                    string[] row = new string[] 
                    {
                        item.g.Key.EANCode.ToString(),       // Barcode
                        item.g.Key.OurStockCode.ToString(),  // Product Code
                        item.g.Key.Description.ToString(),   // desc
                        GetGUIDisplayPrice(item.g.Key.RetailPrice, item.g.Key.RetailPriceExVAT),  // cost
                        Tax.ToString(),                         // Tax   
                        item.g.Sum(s => s.QtyOnHand).ToString(), // Stock
                        item.g.Key.ServiceItem.ToString(),   // Service Item (non-stock)
                        DeductStock.ToString()                  // if not a service item, then it's a stocked item so deduct!
                    };

                    table.Rows.Add(row);
                }
                catch (Exception ex)
                {
                }
            }
        }//ENDIF NULL
    }//END USING

    return table;
}

Upvotes: 0

Views: 178

Answers (2)

Belliez

Reputation: 5376

In the end, this function was turned into a stored procedure, and the stored procedure returned the table, which was created on the server rather than the client. This was almost instant and gave a massive performance improvement!
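A minimal sketch of what the client side might look like after such a change. The stored procedure name (`dbo.GetProductStock`) is an assumption for illustration; the real procedure would perform the join, grouping, and formatting from the original method entirely on the server:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class ProductRepository
{
    public static DataTable GetProducts(string connectionString)
    {
        var table = new DataTable();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.GetProductStock", conn)) // hypothetical proc name
        using (var adapter = new SqlDataAdapter(cmd))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            adapter.Fill(table);   // one round trip; the rows are built server-side
        }
        return table;
    }
}
```

`SqlDataAdapter.Fill` opens and closes the connection itself and populates the `DataTable` column schema from the result set, so none of the per-row client-side work from the original method remains.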

Upvotes: 0

Remus Rusanu

Reputation: 294207

from p in data.Products
where p.Deleted == false
join s in data.ProductStocks on p.ProductID equals s.ProductID
group s by p into g
select new { g }

What is the schema of the Products and ProductStocks tables? What indexes do you have? Start by reading How to analyse SQL Server performance.

Some things stand out immediately:

  • you are fetching all the data from the server onto the client. Don't. Process on the back end.
  • using a Deleted bit field is a recipe for (performance) disaster. You can add it as the leftmost key of the clustered index, with dubious results at best. Partitioning by it can help, but not much. There is no silver bullet. Try to eliminate this requirement: delete the deleted rows.

There is not much room to optimize. Stop fetching all the data.
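One way to act on the "process on the back end" advice without a stored procedure is to project and aggregate inside the query itself, so that LINQ to SQL translates it into a single grouped SELECT and only the needed columns travel over the wire. This is a sketch against the question's data context (column and property names are taken from the question; the generated SQL should still be checked):

```csharp
// Aggregate on the server instead of materialising whole
// Product/ProductStock entities and summing on the client.
var query = from p in data.Products
            where p.Deleted == false
            join s in data.ProductStocks on p.ProductID equals s.ProductID into stock
            select new
            {
                p.EANCode,
                p.OurStockCode,
                p.Description,
                p.RetailPrice,
                p.RetailPriceExVAT,
                TaxRate = p.ProductTax != null ? p.ProductTax.TaxRate : 0,
                // group join + Sum lets the provider emit SUM(...) with GROUP BY
                QtyOnHand = stock.Sum(s => (decimal?)s.QtyOnHand) ?? 0,
                p.ServiceItem
            };
```

The rows can then be copied into the `DataTable` with a simple loop, but each row is now a handful of scalar values computed by SQL Server rather than a fully loaded entity graph.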

Upvotes: 2
