Reputation: 22652
We have an operation in which more than 100,000 records are read from a CSV file and inserted into a database. When I use a file with 10 records, the operation completes successfully in less than a minute.
When I use 100,000 records, I get the following error after about 10 minutes: “Invalid attempt to call Read when reader is closed.” Is there a timeout I can configure to avoid this error?
Note: CommandTimeout is already set to zero.
// Build the stored-procedure command; CommandTimeout = 0 disables the ADO.NET command timeout.
DbCommand cmd = db.GetStoredProcCommand("aspInsertZipCode");
cmd.CommandTimeout = 0;

// Trim the trailing delimiter and pass the accumulated rows as a single string parameter.
dataStringToProcess.Remove(dataStringToProcess.Length - 1, 1);
db.AddInParameter(cmd, "@DataRows", DbType.String, dataStringToProcess.ToString());
db.AddInParameter(cmd, "currDate", DbType.DateTime, DateTime.Now);
db.AddInParameter(cmd, "userID", DbType.Int32, UserID);
db.AddOutParameter(cmd, "CountOfUnchangedZipCode", DbType.String, 1000);

DbDataReader rdr = null;
try
{
    rdr = (DbDataReader)db.ExecuteReader(cmd);

    // First result set: newly added ZIP codes.
    if (rdr.Read())
    {
        if (!String.IsNullOrEmpty(Utility.GetString(rdr, "NewZipCode")))
            strNewZipCode = strNewZipCode + "," + Utility.GetString(rdr, "NewZipCode");
    }

    // Second result set: retired ZIP codes.
    rdr.NextResult();
    if (rdr.Read())
    {
        strRetiredZipCode = strRetiredZipCode + "," + Utility.GetString(rdr, "RetiredZipCode");
    }

    // Accumulate the unchanged-ZIP count from the output parameter, then reset the buffer.
    int TempUnchageZipCount = Convert.ToInt32(db.GetParameterValue(cmd, "CountOfUnchangedZipCode"));
    CountOfUnchangedZipCode = CountOfUnchangedZipCode + TempUnchageZipCount;
    dataStringToProcess = new StringBuilder();
    cntRec = 0;
}
catch
{
    if (rdr != null && (!rdr.IsClosed))
        rdr.Close();
    throw;
}
finally
{
    // The finally block also closes the reader, so the close in the catch above is redundant but harmless.
    if (rdr != null && (!rdr.IsClosed))
        rdr.Close();
}
cmd.Dispose();
Upvotes: 1
Views: 2031
Reputation: 22652
It was actually an MSDTC and System.Transactions timeout issue. I increased the MSDTC transaction timeout in Component Services, and I also updated the timeout values for System.Transactions in machine.config on the application server.
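For reference, a minimal sketch of the System.Transactions part of that change; the element names are the standard machine.config settings, but the timeout values shown here are placeholders, not the ones I used:

&lt;configuration&gt;
  &lt;system.transactions&gt;
    &lt;!-- default timeout applied when code does not specify one --&gt;
    &lt;defaultSettings timeout="00:30:00" /&gt;
    &lt;!-- hard ceiling on any transaction timeout; only honoured when set in machine.config --&gt;
    &lt;machineSettings maxTimeout="01:00:00" /&gt;
  &lt;/system.transactions&gt;
&lt;/configuration&gt;

The default maxTimeout is 10 minutes, which lines up with the operation failing after roughly 10 minutes in the question above.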
Upvotes: 1
Reputation: 48415
It could be a database timeout instead... have you checked that?
What are you actually trying to do anyway? If you really need to process that many rows, consider breaking the work into smaller chunks, say, only 1,000 rows at a time, as sketched below.
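Just as a rough sketch of that approach, assuming a hypothetical InsertBatch helper that wraps your existing aspInsertZipCode call; the batch size, csvPath, and file-reading details are illustrative only:

// requires: using System.Collections.Generic; using System.IO;
const int BatchSize = 1000;                       // illustrative batch size; tune for your environment
var batch = new List&lt;string&gt;(BatchSize);

foreach (string line in File.ReadLines(csvPath))  // csvPath: path to your CSV file (assumed)
{
    batch.Add(line);
    if (batch.Count == BatchSize)
    {
        InsertBatch(batch);                       // hypothetical wrapper around the stored-proc call
        batch.Clear();
    }
}

if (batch.Count > 0)
    InsertBatch(batch);                           // flush the final partial batch

Smaller batches keep each database call (and any enclosing transaction) well inside the timeout limits, at the cost of a few more round trips.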
Upvotes: 2