Reputation: 645
I have data in a log file in the following format:

29 2011-02-01 11:00:40 1 0 1 0
30 2011-02-01 11:00:44 1 0 1 0
32 2011-02-01 11:00:49 1 0 1 0

Each line represents a row in the database, and there are 500-1000 rows to be inserted at a time.

Is it better to insert the data into the database directly by calling a stored procedure (the procedure contains the logic to read the log file and insert the data),
OR
is it better to parse the file in the application and insert the data?
Upvotes: 3
Views: 385
Reputation: 11
You could alternatively use the BCP (bulk copy) command-line utility.
Import the log file into Excel and save it as a CSV so each field gets its own column. Then, in a command prompt, go to \Program Files\Microsoft SQL Server\100\Tools\Binn (on the drive where SQL Server is installed) and import into the table with the following syntax:
bcp TableName in "D:\logs\log.csv" -c -S ServerName -U username -P password -t "," -r "\n"
You can alternatively specify -n (native format) instead of -c (character format). -t is the field terminator (a comma here), -r is the row terminator.
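If SQL Server can read the log file directly, a T-SQL BULK INSERT may let you skip the Excel/CSV step and load the delimited file as-is. A minimal sketch, assuming a target table dbo.DeviceLog and the file at D:\logs\log.txt (both names are assumptions):

BULK INSERT dbo.DeviceLog
FROM 'D:\logs\log.txt'
WITH (
    FIELDTERMINATOR = '\t',   -- change to ' ' if the log is space-delimited
    ROWTERMINATOR = '\n',
    TABLOCK                   -- take a table lock; usually faster for large loads
);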
Upvotes: 1
Reputation: 13931
This is probably .tsv format (tab-separated values).
LOAD DATA LOCAL INFILE 'uniq.csv' INTO TABLE tbl
FIELDS TERMINATED BY '\t'
ENCLOSED BY ''
LINES TERMINATED BY '\n'
(column1, column2, column3);
\t is the tab character. Change it to a space if it doesn't work.
Manual processing row after row will not be efficient, because the hard disk is accessed many times when you read row by row; the LOAD command should read the whole file at once.
Bear in mind that indexes can seriously slow down inserts like this. If you need to load a lot of data (100,000 rows, for example), it is sometimes better to drop the index, insert, and re-create the index after inserting.
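For example, a rough sketch of that drop/load/recreate pattern, with a hypothetical secondary index idx_time on column2:

ALTER TABLE tbl DROP INDEX idx_time;          -- drop the secondary index first

LOAD DATA LOCAL INFILE 'uniq.csv' INTO TABLE tbl
FIELDS TERMINATED BY '\t'
ENCLOSED BY ''
LINES TERMINATED BY '\n'
(column1, column2, column3);

ALTER TABLE tbl ADD INDEX idx_time (column2); -- rebuild it after the load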
Upvotes: 1
Reputation: 521
It is better (for performance) to insert the data in bulk (using LOAD DATA, or wrapping the inserts in a single transaction; I am assuming the stored procedure does something similar) than to insert rows one at a time with a separate query per row.
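For illustration, a minimal sketch of the difference, using the sample rows from the question and assumed column names (id, logged_at, a, b, c, d):

START TRANSACTION;
INSERT INTO tbl (id, logged_at, a, b, c, d) VALUES
    (29, '2011-02-01 11:00:40', 1, 0, 1, 0),
    (30, '2011-02-01 11:00:44', 1, 0, 1, 0),
    (32, '2011-02-01 11:00:49', 1, 0, 1, 0);  -- one statement for all the rows
COMMIT;

One multi-row statement inside one transaction avoids paying a network round trip and a commit for each of the 500-1000 rows.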
Upvotes: 0