Reputation: 305
I want to read data from a text file line by line and insert the data from each line into a database.
What I'm thinking of now is to read line by line and, at each line, insert into the database. What I'm asking is whether there is a better way, e.g. something that inserts the data at once instead of looping line by line and inserting record by record. I have around 500+ lines in the text file to store each day, so performance is my concern here.
Note that I need to insert each line as a row in the DB. The data are comma-delimited, so I need to split them and insert them into specific columns.
Any recommendations?
Upvotes: 1
Views: 10344
Reputation: 1014
Imports System.IO
Imports System.Collections.Generic
Imports System.Data.SqlServerCe

Module Module1
    Sub Main()
        ' Read every line of the file into a list first.
        Dim arrText As New List(Of String)()
        Using objReader As New StreamReader("c:\test.txt")
            Dim sLine As String = objReader.ReadLine()
            While sLine IsNot Nothing
                arrText.Add(sLine)
                sLine = objReader.ReadLine()
            End While
        End Using

        ' Connection string and table/column names are placeholders.
        Using Con As New SqlCeConnection("Data Source=c:\mydb.sdf")
            Con.Open()
            ' Reuse one parameterized command for all the inserts.
            Using command As New SqlCeCommand("INSERT INTO [table](col1) VALUES(@data1)", Con)
                command.Parameters.AddWithValue("@data1", "")
                For Each sLine As String In arrText
                    command.Parameters("@data1").Value = sLine
                    command.ExecuteNonQuery()
                Next
            End Using
        End Using
    End Sub
End Module
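If the per-row inserts are still too slow, wrapping the whole loop in one transaction usually helps a lot, since each ExecuteNonQuery otherwise commits individually. A minimal C# sketch of the same idea (connection string and table/column names are again placeholders):

using System.Data.SqlServerCe;

// Sketch only: the same insert loop, committed in a single transaction.
using (var con = new SqlCeConnection(@"Data Source=c:\mydb.sdf"))
{
    con.Open();
    using (SqlCeTransaction tx = con.BeginTransaction())
    using (var cmd = new SqlCeCommand("INSERT INTO [table](col1) VALUES(@data1)", con, tx))
    {
        cmd.Parameters.AddWithValue("@data1", "");
        foreach (string line in System.IO.File.ReadLines(@"c:\test.txt"))
        {
            cmd.Parameters["@data1"].Value = line;
            cmd.ExecuteNonQuery();
        }
        tx.Commit(); // everything goes to the database in one commit
    }
}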
Upvotes: 3
Reputation: 81
You can bring the text file into a DataTable first (this causes no performance issue, as it is all in memory with no round trips to the server) and then insert it into the database using the bulk-copy feature. I presume the data is to be inserted into a SQL Server database; you can use SqlBulkCopy for this. Here is sample code:
using System.Data;
using System.Data.SqlClient;
using System.IO;

private void BulkInsert()
{
    // TableLock takes a bulk-update lock for the duration of the copy.
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(
        "Server=ServerName;Database=test;Trusted_Connection=True;",
        SqlBulkCopyOptions.TableLock))
    {
        bulkCopy.DestinationTableName = "target_table";
        bulkCopy.WriteToServer(Text2Table());
    }
}

private DataTable Text2Table()
{
    DataTable dt = new DataTable();
    // The columns must exist before rows can be added.
    dt.Columns.Add("column1");
    dt.Columns.Add("column2");
    dt.Columns.Add("column3");
    dt.Columns.Add("column4");

    using (StreamReader sr = new StreamReader(@"c:\test\test.txt"))
    {
        string inrecord;
        while ((inrecord = sr.ReadLine()) != null)
        {
            // The question's data is comma-delimited.
            string[] values = inrecord.Split(',');
            DataRow dr = dt.NewRow();
            dr["column1"] = values[0];
            dr["column2"] = values[1];
            dr["column3"] = values[2];
            dr["column4"] = values[3];
            dt.Rows.Add(dr);
        }
    }
    return dt;
}
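If the DataTable's column names do not match the destination table's, add explicit mappings before calling WriteToServer (the destination names below are placeholders):

// Map each source DataTable column to its destination column.
bulkCopy.ColumnMappings.Add("column1", "Col1");
bulkCopy.ColumnMappings.Add("column2", "Col2");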
Upvotes: 1
Reputation: 2140
The performance problem normally comes from the inserts into the DB, not from reading the file.
To work around this, I suggest you use a strongly typed DataTable: as you are doing inserts only, just add new rows to the .NET DataTable and commit them to the DB in one go at the end (use SqlDataAdapter).
For reading the file, I suggest you use the existing VB.NET library Microsoft.VisualBasic.FileIO.TextFieldParser (ref: http://msdn.microsoft.com/en-us/library/microsoft.visualbasic.fileio.textfieldparser(v=vs.110).aspx).
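A minimal sketch of that approach, assuming a comma-delimited file and placeholder table/column names (an untyped DataTable is used here for brevity):

using System.Data;
using System.Data.SqlClient;
using Microsoft.VisualBasic.FileIO; // reference the Microsoft.VisualBasic assembly

static void ImportFile(string path, string connectionString)
{
    var dt = new DataTable();
    dt.Columns.Add("Col1");
    dt.Columns.Add("Col2");

    // TextFieldParser handles delimiters and quoted fields for you.
    using (var parser = new TextFieldParser(path))
    {
        parser.TextFieldType = FieldType.Delimited;
        parser.SetDelimiters(",");
        while (!parser.EndOfData)
        {
            string[] fields = parser.ReadFields();
            dt.Rows.Add(fields[0], fields[1]);
        }
    }

    using (var con = new SqlConnection(connectionString))
    using (var adapter = new SqlDataAdapter("SELECT Col1, Col2 FROM tblTarget WHERE 1 = 0", con))
    using (var builder = new SqlCommandBuilder(adapter))
    {
        // The command builder derives the INSERT command from the SELECT;
        // Update sends every added row to the server in one call.
        adapter.Update(dt);
    }
}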
Of course, you have another option: instead of using a DataTable, generating a plain-text SQL insertion query would perform a lot better. The output query would look like:
INSERT INTO tblTarget(Id, Col1, Col2)
Values (1, 'Row1.Val1', 'Row1.Val2'),
(2, 'Row2.Val1', 'Row2.Val2'),
(3, 'Row3.Val1', 'Row3.Val2'),
(4, 'Row4.Val1', 'Row4.Val2'),
...
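A hedged sketch of building such a statement, using parameters instead of inlined literals to avoid quoting and injection problems (table/column names are placeholders; note that SQL Server caps a VALUES list at 1000 rows and a command at 2100 parameters, so chunk bigger files):

using System.Collections.Generic;
using System.Data.SqlClient;
using System.Text;

static void MultiRowInsert(IList<string[]> rows, SqlConnection con)
{
    var sql = new StringBuilder("INSERT INTO tblTarget(Col1, Col2) VALUES ");
    using (var cmd = new SqlCommand { Connection = con })
    {
        for (int i = 0; i < rows.Count; i++)
        {
            if (i > 0) sql.Append(", ");
            // One value pair per row: (@c1_0, @c2_0), (@c1_1, @c2_1), ...
            sql.AppendFormat("(@c1_{0}, @c2_{0})", i);
            cmd.Parameters.AddWithValue("@c1_" + i, rows[i][0]);
            cmd.Parameters.AddWithValue("@c2_" + i, rows[i][1]);
        }
        cmd.CommandText = sql.ToString();
        cmd.ExecuteNonQuery(); // all rows inserted with a single statement
    }
}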
Hope it helps...
Upvotes: 0
Reputation: 89
I think it is not difficult; it is easy. For example, in Java:
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class ScannerReadFile {
    public static void main(String[] args) throws Exception {
        // The JDBC URL, credentials and table/column names are placeholders.
        try (Connection con = DriverManager.getConnection(
                     "jdbc:sqlserver://localhost;databaseName=test", "user", "password");
             BufferedReader br = new BufferedReader(new FileReader("textfile.txt"));
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO target_table(col1, col2) VALUES (?, ?)")) {
            String strLine;
            while ((strLine = br.readLine()) != null) {
                // Split each comma-delimited line into its fields.
                String[] fields = strLine.split(",");
                ps.setString(1, fields[0]);
                ps.setString(2, fields[1]);
                ps.addBatch(); // queue the row instead of one round trip per insert
            }
            ps.executeBatch(); // send all the queued inserts at once
        }
    }
}
Upvotes: 0