Reputation: 9644
I'm using a C# function to query the DB with SqlConnection, SqlCommand and SqlDataAdapter, but performance is quite bad when I try to retrieve a huge number of rows (a few million).
using (SqlConnection mySqlConnection = new SqlConnection(connectionString))
using (SqlCommand mySqlCommand = new SqlCommand(query, mySqlConnection))
{
    mySqlConnection.Open();
    DataSet myDataSet = new DataSet();
    using (SqlDataAdapter mySqlDataAdapter = new SqlDataAdapter(mySqlCommand))
    {
        affectedRow = mySqlDataAdapter.Fill(myDataSet);
    }
}
Is there a way to optimize this query for big data set?
Just as a comparison, this piece of VBA code takes only 4-5 seconds, versus 20-22 seconds for the C# above:
Set cnt = New ADODB.Connection
Set rst = New ADODB.Recordset
Set cmd = New ADODB.Command
cnt.Open
Set rst = cnt.Execute(queryString)
Dim nDimArray As Variant
nDimArray = rst.GetRows
cnt.Close
Upvotes: 0
Views: 497
Reputation: 21492
Don't use a DataSet. Use a SqlDataReader, which is similar to what you were doing in VBA.
using (var mySqlConnection = new SqlConnection(connectionString))
using (var mySqlCommand = new SqlCommand(query, mySqlConnection))
{
    mySqlConnection.Open();
    var reader = mySqlCommand.ExecuteReader();
    while (reader.Read())
    {
        // process the current row here
    }
    reader.Close();
}
More information here: https://msdn.microsoft.com/en-us/library/9kcbe65k%28v=vs.110%29.aspx
You should be able to do this as well, which will clean up after itself in case an exception is thrown during the read:
using (var mySqlConnection = new SqlConnection(connectionString))
{
    using (var mySqlCommand = new SqlCommand(query, mySqlConnection))
    {
        mySqlConnection.Open();
        using (var reader = mySqlCommand.ExecuteReader())
        {
            while (reader.Read())
            {
            }
        }
    }
}
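If you still need the rows in memory (similar to GetRows in the VBA version), a minimal sketch of reading each row into a list could look like the following. It assumes connectionString and query are defined as in the question, and it copies every column as object; adapt the types to your actual schema:
// Assumes: using System.Collections.Generic; using System.Data.SqlClient;
var rows = new List<object[]>();
using (var mySqlConnection = new SqlConnection(connectionString))
using (var mySqlCommand = new SqlCommand(query, mySqlConnection))
{
    mySqlConnection.Open();
    using (var reader = mySqlCommand.ExecuteReader())
    {
        while (reader.Read())
        {
            // Copy the current row's values into an array, roughly what rst.GetRows does in ADO.
            var values = new object[reader.FieldCount];
            reader.GetValues(values);
            rows.Add(values);
        }
    }
}
Keep in mind that materializing millions of rows like this still costs memory; the reader itself streams, so if you can process each row inside the loop instead of storing it, you avoid that cost entirely.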
Upvotes: 6