Reputation: 51
I'm working with SQL Server 2014. One of the features of my web app is to upload CSV files and import the data into a table (called TF) in my database (called TMPA).
I have no idea how to do this. Here is what I tried:
string excelPath = Server.MapPath("~/Files/") + Path.GetFileName(FileUpload1.PostedFile.FileName);
FileUpload1.SaveAs(excelPath);

SqlConnection con = new SqlConnection(@"Data Source=SAMSUNG-PC\SQLEXPRESS;Initial Catalog=TMPA;Persist Security Info=True");
StreamReader sr = new StreamReader(excelPath);

// first line = column headers
string line = sr.ReadLine();
string[] value = line.Split(',');

DataTable dt = new DataTable();
DataRow row;
foreach (string dc in value)
{
    dt.Columns.Add(new DataColumn(dc));
}

// remaining lines = data rows
while (!sr.EndOfStream)
{
    value = sr.ReadLine().Split(',');
    if (value.Length == dt.Columns.Count)
    {
        row = dt.NewRow();
        row.ItemArray = value;
        dt.Rows.Add(row);
    }
}

SqlBulkCopy bc = new SqlBulkCopy(con.ConnectionString, SqlBulkCopyOptions.TableLock);
bc.DestinationTableName = "TF";
bc.BatchSize = dt.Rows.Count;
con.Open();
bc.WriteToServer(dt);
bc.Close();
con.Close();
I tried this code, but it wouldn't work.
PS: TF has more columns than the CSV file: some of the columns are computed and should be calculated automatically after each insert.
Here is the layout of my CSV file (4 columns):
IdProduit,Mois,Reel,Budget
IdProduit is a string, Mois is a date, Reel and Budget are floats.
On the other hand, my SQL Server table looks like this:
|IdProduit|Mois|Reel|Budget|ReelPreviousMois|VarReelBudget|VarReelPrvM|...
|---------|----|----|------|----------------|-------------|-----------|-----
All the other columns should either be null or automatically calculated.
Help me!
Upvotes: 1
Views: 12832
Reputation: 336
I know this is an old question, but this may help anyone who is still interested.
Unless you're sure that your files will never be massive, you should avoid loading the whole batch into memory and sending it all at once to SQL Server, which is what the DataTable approach and (I think) your accepted answer do. You risk an out-of-memory exception on the client side (your file-processing server in this case) or, worse still, on the SQL Server side. You can avoid that by using the SqlBulkCopy class together with an implementation of the IDataReader interface.
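For illustration, a rough version of that pattern might look like the sketch below, assuming the CsvHelper NuGet package (its CsvDataReader is unrelated to the class in the package shown next); the path, connection string and table name are placeholders:

using System.Data.SqlClient;
using System.Globalization;
using System.IO;
using CsvHelper;

// Stream the CSV straight into SqlBulkCopy without materializing a DataTable.
using (var reader = new StreamReader(@"pathToYourCsv"))
using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
using (var csvDataReader = new CsvDataReader(csv)) // IDataReader over the CSV rows
using (var bulkCopy = new SqlBulkCopy("yourConnectionString", SqlBulkCopyOptions.TableLock))
{
    bulkCopy.DestinationTableName = "TF";
    bulkCopy.EnableStreaming = true;

    // Map only the columns present in the file; computed columns in TF are left untouched.
    bulkCopy.ColumnMappings.Add("IdProduit", "IdProduit");
    bulkCopy.ColumnMappings.Add("Mois", "Mois");
    bulkCopy.ColumnMappings.Add("Reel", "Reel");
    bulkCopy.ColumnMappings.Add("Budget", "Budget");

    bulkCopy.WriteToServer(csvDataReader);
}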
I wrote a package that I think could be of interest in cases such as yours. The code would look like so:
var dataReader = new CsvDataReader("pathToYourCsv",
    new List<TypeCode>(4)
    {
        TypeCode.String,   // IdProduit
        TypeCode.DateTime, // Mois
        TypeCode.Double,   // Reel
        TypeCode.Double    // Budget
    });

this.bulkCopyUtility.BulkCopy("tableName", dataReader);
There are also additional configuration options for more complex scenarios (flexible column mapping, additional static column values that are not present in the CSV file, value transformation). The package is open source (project on GitHub) and should work on .NET Core and .NET Framework.
As a side comment, the SQL Server recovery model may be important when doing massive imports. Whenever possible, use Simple or Bulk-Logged to avoid huge transaction log growth.
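If you need to switch it from code rather than SSMS, a minimal sketch (assuming sufficient permissions; the connection string is a placeholder and TMPA is the database from the question):

using System.Data.SqlClient;

// One-off change, normally made by a DBA; switch back (and take a full backup)
// afterwards if the database is usually in Full recovery.
using (var con = new SqlConnection("yourConnectionString"))
using (var cmd = new SqlCommand("ALTER DATABASE [TMPA] SET RECOVERY BULK_LOGGED;", con))
{
    con.Open();
    cmd.ExecuteNonQuery();
}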
Upvotes: 1
Reputation: 51
I fixed it using the open-source .NET library FileHelpers. Here's the link: http://www.filehelpers.net/
Here's what I did:
<asp:FileUpload ID="FileUpload1" runat="server" />
<asp:Button ID="Button1" OnClick="UploadF" runat="server" Text="Importer" />
And here's the code-behind:
// FileHelpers record definition. The delimiter must match the uploaded file
// (the sample CSV in the question is comma-separated).
[DelimitedRecord("|")]
public class TBFtable
{
    public string IdProduit;
    public DateTime Mois;
    public float Reel;
    public float Budget;
}

protected void UploadF(object sender, EventArgs e)
{
    string excelPath = Server.MapPath("~/Files/") + Path.GetFileName(FileUpload1.PostedFile.FileName);
    FileUpload1.SaveAs(excelPath);

    SqlServerStorage storage = new SqlServerStorage(typeof(TBFtable), ConfigurationManager.ConnectionStrings["bd"].ConnectionString);
    storage.InsertSqlCallback = new InsertSqlHandler(GetInsertSqlCust);

    TBFtable[] res = CommonEngine.ReadFile(typeof(TBFtable), excelPath) as TBFtable[];
    storage.InsertRecords(res);

    ScriptManager.RegisterClientScriptBlock(this, this.GetType(), "alertMessage", "alert('Données enregistrées avec succès !')", true);
}

// Builds the INSERT statement FileHelpers runs for each record.
// Values are concatenated into the SQL text, so dates and floats follow the current culture.
protected string GetInsertSqlCust(object record)
{
    TBFtable obj = (TBFtable)record;
    return String.Format("INSERT INTO TF (IdProduit, Mois, Reel, Budget) VALUES ('{0}', '{1}', '{2}', '{3}');", obj.IdProduit, obj.Mois, obj.Reel, obj.Budget);
}
Upvotes: 2
Reputation: 1500
You're on the right path. Using SqlBulkCopy will provide the best performance when inserting the data into SQL Server. However, rather than writing your own CSV parser, I would use the solid one already provided by the .NET Framework: the TextFieldParser class in the Microsoft.VisualBasic assembly. To cope with TF having more columns than the file, you can either add the missing columns to your DataTable before sending it to SqlBulkCopy, or set up SqlBulkCopy column mappings so that only the columns present in the CSV are written.
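A minimal sketch of that approach, with the path and connection string as placeholders and TF as the destination table from the question:

using System;
using System.Data;
using System.Data.SqlClient;
using Microsoft.VisualBasic.FileIO; // add a reference to the Microsoft.VisualBasic assembly

// Parse the CSV with TextFieldParser into a typed DataTable,
// then bulk-insert only the four file columns into TF.
var dt = new DataTable();
dt.Columns.Add("IdProduit", typeof(string));
dt.Columns.Add("Mois", typeof(DateTime));
dt.Columns.Add("Reel", typeof(double));
dt.Columns.Add("Budget", typeof(double));

using (var parser = new TextFieldParser(@"pathToYourCsv"))
{
    parser.TextFieldType = FieldType.Delimited;
    parser.SetDelimiters(",");
    parser.HasFieldsEnclosedInQuotes = true;

    parser.ReadFields(); // skip the header row

    while (!parser.EndOfData)
    {
        string[] fields = parser.ReadFields();
        // Parse uses the current culture; pass a CultureInfo if the file format differs.
        dt.Rows.Add(fields[0],
                    DateTime.Parse(fields[1]),
                    double.Parse(fields[2]),
                    double.Parse(fields[3]));
    }
}

using (var bulkCopy = new SqlBulkCopy("yourConnectionString", SqlBulkCopyOptions.TableLock))
{
    bulkCopy.DestinationTableName = "TF";

    // Map only the columns the file provides; the computed/extra columns are left alone.
    bulkCopy.ColumnMappings.Add("IdProduit", "IdProduit");
    bulkCopy.ColumnMappings.Add("Mois", "Mois");
    bulkCopy.ColumnMappings.Add("Reel", "Reel");
    bulkCopy.ColumnMappings.Add("Budget", "Budget");

    bulkCopy.WriteToServer(dt);
}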
Upvotes: 1