user2858578

Reputation: 17

CSV files to SQL database in C#

What is the best approach for storing information gathered locally in .csv files in a SQL database with C#.NET? My reasons for asking are: 1) the data I have to handle is massive (millions of rows in each csv); 2) the data is extremely precise, since it describes measurements on a nanoscopic scale, and is therefore delicate.

My first thought was to store each row of the csv in a corresponding row in the database. I did this using the DataTable class. When done, I felt that if something went wrong while parsing the .csv file, I would never notice.

My second thought is to upload the .csv files to the database in their raw .csv format and later parse a file from the database into the local environment when the user asks for it. If that is even possible in C#.NET with Visual Studio 2013, how could it be done in an efficient and secure manner?
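
To make the second idea concrete, something like this is what I have in mind, though I am not sure it is efficient or secure for files this big (the table name, column names, path and connection string below are only placeholders):

    // Naive placeholder for the second idea: store the raw .csv text in the
    // database and parse it later. Table and column names are made up.
    string csvText = File.ReadAllText(csvPath);

    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(
        "INSERT INTO RawCsvFiles (FileName, Content) VALUES (@name, @content)", conn))
    {
        cmd.Parameters.AddWithValue("@name", Path.GetFileName(csvPath));
        cmd.Parameters.AddWithValue("@content", csvText);
        conn.Open();
        cmd.ExecuteNonQuery();
    }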

Upvotes: 0

Views: 1303

Answers (3)

pseudocoder

Reputation: 4392

It sounds like you are simply asking whether you should store a copy of the source CSV in the database, so that if there was an import error you can check what happened after the fact.

In my opinion, this is probably not a great idea. It immediately makes me ask, how would you know that an error had occurred? You certainly shouldn't rely on humans noticing the mistake, so you must develop a way to programmatically check for errors. If you have an automated error-checking method, you should apply that method when the import occurs and avoid the error in the first place. Do you see the circular logic here?

Maybe I'm missing something but I don't see the benefit of storing the CSV.
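
For example, a rough sketch of checking while the import happens, assuming a simple two-column layout of id and measurement (the layout and names are only an assumption about your data):

    // Rough sketch: validate each row while importing instead of keeping the
    // raw CSV around to inspect afterwards. Column layout is assumed.
    var badRows = new List<string>();
    var table = new DataTable();
    table.Columns.Add("SampleId", typeof(string));
    table.Columns.Add("Measurement", typeof(double));

    foreach (string line in File.ReadLines(csvPath).Skip(1)) // skip header row
    {
        string[] fields = line.Split(',');
        double value;
        if (fields.Length != 2 || !double.TryParse(fields[1], NumberStyles.Float,
                CultureInfo.InvariantCulture, out value))
        {
            badRows.Add(line); // log or reject instead of silently importing
            continue;
        }
        table.Rows.Add(fields[0], value);
    }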

Upvotes: 1

Pidon

Reputation: 275

I used the .NET DataStreams library from CSV Reader in my project. It uses the SqlBulkCopy class, though the library is not free.

Example:

    using (CsvDataReader csvData = new CsvDataReader(path, ',', Encoding.UTF8))
    {
        // will read in first record as a header row and
        // name columns based on the values in the header row
        csvData.Settings.HasHeaders = true;

        csvData.Columns.Add("nvarchar");
        csvData.Columns.Add("float"); // etc.

        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "DestinationTable";
            bulkCopy.BulkCopyTimeout = 3600;

            // Optionally, you can declare column mappings using the bulkCopy.ColumnMappings property

            bulkCopy.WriteToServer(csvData);
        }
    }
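
If the CSV columns don't line up with the destination table by position, the column mappings mentioned in the comment can be declared explicitly; a small example (the column names here are made up):

    // Inside the SqlBulkCopy using-block above, before WriteToServer:
    bulkCopy.ColumnMappings.Add("SampleId", "SampleId");       // CSV column -> table column
    bulkCopy.ColumnMappings.Add("Measurement", "Measurement"); // names are examples only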

Upvotes: 1

dmigo

Reputation: 3029

You should probably use BULK INSERT, with your csv file as the source. But this will only work if the file is accessible from the machine that is running your SQL Server.
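
A minimal sketch of running it from C# (the table name, file path, and connection string are placeholders, and the path has to be visible to the SQL Server machine itself):

    // Minimal sketch: run a T-SQL BULK INSERT from C#.
    // The file path is resolved on the SQL Server machine, not the client.
    string sql = @"BULK INSERT dbo.Measurements
                   FROM 'C:\data\measurements.csv'
                   WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);";

    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(sql, conn))
    {
        conn.Open();
        cmd.ExecuteNonQuery();
    }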

Here you can find a nice solution as well. In short, it looks like this:

    // Assumes a CsvReader that implements IDataReader (e.g. the LumenWorks reader)
    using (StreamReader file = new StreamReader(bulk_data_filename))
    using (CsvReader csv = new CsvReader(file, true, ','))
    using (SqlBulkCopy copy = new SqlBulkCopy(conn))
    {
        copy.DestinationTableName = tablename;
        copy.WriteToServer(csv);
    }

Upvotes: 0
