Chad Levy

Reputation: 10140

Performance implications of opening multiple SQLConnections in poor-man's ORM

I'm implementing a poor-man's ORM in a legacy application (a .NET 2.0 web app with hand-coded SQL queries). I have two data classes, Customer and Order:

Public Class Customer
    Public Property CustomerId As Integer
    Public Property CustomerName As String
    Public Property Orders As List(Of Order)
End Class

Public Class Order
    Public Property OrderId As Integer
    Public Property OrderItem As String
End Class

I'm using a SqlDataReader to manually map the SQL results to an instance of the Customer class:

Dim connection As New SqlConnection(connectionString)
Dim command As New SqlCommand(sql, connection)
Dim reader As SqlDataReader

connection.Open()

reader = command.ExecuteReader()

Dim customer As New Customer()

While reader.Read
    With customer
        .CustomerId =   reader("CustomerId")
        .CustomerName = reader("CustomerName")
        .Orders =       getOrdersByCustomerId(reader("CustomerId"))   ' get orders
    End With
End While

connection.Close()

To populate the Customer.Orders, I call a function that returns a List(Of Order):

Private Function getOrdersByCustomerId(ByVal customerId As Integer) As List(Of Order)

    Dim connection As New SqlConnection(connectionString)
    Dim command As New SqlCommand(sql, connection)
    Dim reader As SqlDataReader

    connection.Open()

    reader = command.ExecuteReader()

    Dim orders As New List(Of Order)

    While reader.Read
        Dim order As New Order
        With order
            .OrderId =   reader("OrderId")
            .OrderItem = reader("OrderItem")
        End With
        orders.Add(order)
    End While

    connection.Close()

    Return orders

End Function

I'm concerned about the performance of this method, particularly when pulling multiple Customers. For each Customer pulled, I have to hit the database again (opening another connection) to get that customer's orders. If I'm not mistaken, it wouldn't take many records to accumulate a large number of open connections.

My question is two-fold:

  1. How can I determine the number of new connections being made, and whether I'll run into connection-pool or other .NET-imposed limits?
  2. Is there a better way? (see comment below)

One thought I've had is to pass the SqlConnection object created in the calling class to the getOrdersByCustomerId function and have that function reuse the (apparently?) already-open connection. I haven't tested it, mainly because I don't know how to determine whether it's better than my existing method. Thoughts?
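For what it's worth, the shared-connection variant I have in mind would look something like this (untested sketch; ordersSql and the parameter name are placeholders, and reusing the outer connection while its SqlDataReader is still open would require MARS, i.e. MultipleActiveResultSets=True, available from SQL Server 2005 on):

' Untested sketch: the caller owns the (already open) connection; this
' helper only creates its own command and reader. ordersSql is a placeholder.
Private Function getOrdersByCustomerId(ByVal customerId As Integer, ByVal connection As SqlConnection) As List(Of Order)
    Dim orders As New List(Of Order)

    Using command As New SqlCommand(ordersSql, connection)
        command.Parameters.AddWithValue("@CustomerId", customerId)
        Using reader As SqlDataReader = command.ExecuteReader()
            While reader.Read
                Dim order As New Order
                order.OrderId = reader("OrderId")
                order.OrderItem = reader("OrderItem")
                orders.Add(order)
            End While
        End Using
    End Using

    Return orders
End Function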


Background:

I'm creating a search web service that returns JSON for processing by the client. The service takes a single search parameter, performs multiple lookups on different tables, then returns a custom JSON object. For example, if the user puts in what appears to be a name, I search both the Customers table for a list of the top n customers and the Orders table for the top n orders that have a customer with that name.

Upvotes: 0

Views: 502

Answers (4)

Steve

Reputation: 216313

Working from the database side of this question: you don't show the SQL command used to retrieve the records.

For example, when you load your customer by ID, try to load its orders as well:

SELECT CUSTOMERS.*, ORDERS.* 
FROM CUSTOMERS LEFT JOIN ORDERS 
     ON CUSTOMERS.ID = ORDERS.CUSTOMERID
WHERE CUSTOMERS.ID = @custID

With this data in memory, you can create the Customer and its Orders without going to the database twice.
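A rough sketch of mapping that joined result set in one pass (column names are assumed to match the classes in the question; the IsDBNull check handles customers with no orders, which the LEFT JOIN returns as NULL order columns):

Dim customer As Customer = Nothing

While reader.Read
    If customer Is Nothing Then
        customer = New Customer()
        customer.CustomerId = reader("CustomerId")
        customer.CustomerName = reader("CustomerName")
        customer.Orders = New List(Of Order)
    End If
    ' LEFT JOIN: a customer with no orders yields NULL order columns.
    If Not IsDBNull(reader("OrderId")) Then
        Dim order As New Order
        order.OrderId = reader("OrderId")
        order.OrderItem = reader("OrderItem")
        customer.Orders.Add(order)
    End If
End While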

Upvotes: 2

Tony Hopkinson

Reputation: 20320

Don't worry about connections unless you've turned pooling off, and you wouldn't be asking this question if you had.

If you want to make things a bit clearer, though, a Using block would be a good idea.

Using connection As New SqlConnection(connectionString)
    ' Insert code here
End Using

connection will be disposed at End Using. Because you are on the default behavior, the connection goes back into the pool to be reused the next time your code needs one (pools are actually per AppDomain). If nothing needs it for the default idle period (2 minutes, I seem to remember), it gets completely disposed at that point.

PS: it's a good idea to use Using blocks for anything local to a method that implements IDisposable.

On hitting the server for each order, there's no definitive answer. Getting, say, a page of orders and then creating a collection of order objects from it would improve performance, as long as you aren't fetching a load of data you're unlikely to need. That is, only get the data you need for display; if there's lots more, use some sort of lazy-loading scheme and fetch the data when you need it, or have different classes, e.g. OrderSummary and OrderDetail.
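A minimal sketch of the lazy-loading idea, assuming the Customer class from the question (the orders query runs only the first time Orders is read):

Public Class Customer
    Public Property CustomerId As Integer
    Public Property CustomerName As String

    Private _orders As List(Of Order)

    ' Lazy load: hit the database only when Orders is first accessed.
    Public ReadOnly Property Orders As List(Of Order)
        Get
            If _orders Is Nothing Then
                _orders = getOrdersByCustomerId(CustomerId)
            End If
            Return _orders
        End Get
    End Property
End Class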

Upvotes: 2

user1425057

Reputation:

First, you are missing the Using statement. Your code should look like this, instead of relying on the garbage collector to dispose of everything automatically:

Using connection As New SqlConnection(connectionString)
    Using command As New SqlCommand(sql, connection)
        connection.Open()

        Using reader As SqlDataReader = command.ExecuteReader()
            Dim customer As New Customer()

            While reader.Read
                customer.CustomerId = reader("CustomerId")
                customer.CustomerName = reader("CustomerName")
                customer.Orders = getOrdersByCustomerId(reader("CustomerId"))
            End While
        End Using
    End Using
End Using

Back to the original question about performance

I would use a DataTable in case there is a large number of records, keeping in mind that the DataTable object should be disposed after use with a Using statement, in the same way as above. That way there is no need to iterate just to add records.

Going to the Database again

Consider using ViewState or Session here; if you need to detect changes in the underlying data, you can also use SqlDependency.

getOrdersByCustomerId(reader("CustomerId"))

Is it possible for you to bring back all the customer-related information that is fetched on form load and preserve it in ViewState? Going to the database over and over inside a DataReader loop is not recommended; instead, read from the ViewState variable during the DataReader iteration.
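A hedged sketch of the caching idea using Session (the key name is made up; ViewState would work the same way on a page):

' Check the cache first; fall back to the database and cache the result.
Dim cacheKey As String = "Orders_" & customerId.ToString()
Dim orders As List(Of Order) = TryCast(Session(cacheKey), List(Of Order))

If orders Is Nothing Then
    orders = getOrdersByCustomerId(customerId)
    Session(cacheKey) = orders
End If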

Upvotes: 3

Andomar
Andomar

Reputation: 238176

Opening and reopening a connection is virtually free. Thus it's good practice to open a connection and dispose it ASAP, preferably in a using block.

You can test for connection leaks by adding Max Pool Size=1; to your connection string. That will limit the number of available connections to 1. You'll get an error if you leak a connection. Be sure not to use this setting in production.
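For example (server and database names here are placeholders):

' Debug-only connection string: any leaked connection makes the next
' Open() time out instead of silently grabbing another pooled connection.
Dim connectionString As String = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True;Max Pool Size=1;"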

For performance, you'd normally avoid hitting the database for every row in a large set. So you could write a new query or procedure that returns all orders for all relevant customers. However in practice that's an unusual query. You'd normally present a list of customers and only when the end user zooms in on that customer would you retrieve the list of orders.

Upvotes: 3
