Reputation: 4406
I have an Excel file that is uploaded through my MVC web app, which I have to process and act on. I receive the file in the controller:
public class StripExcelDocument
{
    public DataSet Convert(HttpPostedFileBase file)
    {
        return GetDataFromExcel(file.InputStream);
    }

    private DataSet GetDataFromExcel(Stream target)
    {
        // ExcelDataReader: read the .xlsx stream into a DataSet,
        // treating the first row of each sheet as column headers.
        using (var excelReader = ExcelReaderFactory.CreateOpenXmlReader(target))
        {
            excelReader.IsFirstRowAsColumnNames = true;
            return excelReader.AsDataSet();
        }
    }
}
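For reference, the controller action that receives the upload looks roughly like this (simplified; the action name and view are illustrative):

[HttpPost]
public ActionResult Upload(HttpPostedFileBase file)
{
    // Strip the workbook into a DataSet, then run each row through the processor.
    var data = new StripExcelDocument().Convert(file);
    // ... iterate data.Tables[0].Rows and process each record ...
    return View();
}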
Then I send it through a processor I have created, which is just a large conditional statement; based on the outcome, each record gets sent to a specific table in the database:
public class Processor
{
    public Result Process(string foo, int bar)
    {
        if (FirstCondition(foo, bar))
        {
            SetResult(foo, bar);
        }
        if (SecondCondition(foo, bar))
        {
            SetResult(foo, bar);
        }
        if (ThirdCondition(foo, bar))
        {
            SetResult(foo, bar);
        }
        //etc...
    }
}
Obviously this works fine when the user enters a single record, but when processing large Excel files it either:
A: Times out on the server. B: Leaves the user staring at a screen for a long while.
What is a more effective way to handle bulk processing of large amounts of data from an Excel file, where each record needs to become its own entity in the database?
Upvotes: 0
Views: 939
Reputation: 186
- Do a bulk import of all the records from the Excel sheet into a single staging table; SqlBulkCopy can do this in one shot (see the sketch after this list).
- Create a stored procedure that, based on your conditions, does the insert/update into the destination tables in one go.

Try to keep SqlBulkCopy as a last option, though, because it belongs to older versions of .NET and there may be better options available now.
The stored procedure approach will be faster than performing the equivalent LINQ operations in the code-behind.
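A minimal sketch of that two-step flow, assuming a hypothetical staging table dbo.ExcelStaging and stored procedure dbo.ProcessStagedRecords (both names are placeholders, not from the question); excelData would be one of the tables from the DataSet that StripExcelDocument produces:

using System.Data;
using System.Data.SqlClient;

public class BulkImporter
{
    public void Import(DataTable excelData, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // Step 1: bulk-copy every Excel row into the staging table in one shot.
            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "dbo.ExcelStaging"; // placeholder name
                bulkCopy.WriteToServer(excelData);
            }

            // Step 2: let a stored procedure apply the conditions and route each
            // record to its destination table with set-based inserts/updates.
            using (var command = new SqlCommand("dbo.ProcessStagedRecords", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                command.CommandTimeout = 0; // no client-side timeout for large batches
                command.ExecuteNonQuery();
            }
        }
    }
}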
"A: Times out on the server. B: Leaves the user staring at a screen for a while."
Do it asynchronously.
class ThreadTest
{
    public ActionResult Main()
    {
        // Kick the long-running work onto a background thread and return
        // the view immediately, so the user is not left waiting.
        Thread t = new Thread(WriteY);
        t.Start();
        return View();
    }

    void WriteY()
    {
        // Do the Excel processing / database work here.
    }
}
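An equivalent sketch using Task.Run instead of a raw Thread (an alternative suggestion, not part of the answer above); in either case, be aware that work started this way is lost if the application pool recycles before it finishes:

using System.Threading.Tasks;
using System.Web.Mvc;

public class UploadController : Controller
{
    [HttpPost]
    public ActionResult Main()
    {
        // Fire and forget: the request returns while the import runs in the background.
        Task.Run(() => WriteY());
        return View();
    }

    private void WriteY()
    {
        // Long-running Excel processing / bulk import goes here.
    }
}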
For the SQL timeout,
sqlCommand.CommandTimeout = 0;
will set it to infinite, i.e. the command never times out on the client side.
Upvotes: 1