Tarek

Reputation: 3798

Large set of CSV files needs to be inserted into a database

I was asked to import a large set of CSV files into MySQL.

There are 20k CSV files, each containing from 100 to 100k lines of data.

So here is what I thought I could do:

On my machine I could join all the files into one using a simple command:

 copy *.csv  combined.csv 

and then process that combined file with PHP's fgetcsv and insert the rows into the DB.

Or

I could write a PHP program that reads a directory, processes each file, and moves it into another folder when done; a rough sketch of what I mean follows below.
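
Something like this (the table name, columns, and connection credentials are only placeholders):

<?php
// Sketch: import every CSV in a directory, then move each processed
// file aside. Table name, columns and credentials are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=DATABASE', 'root', 'PASSWORD');
$stmt = $pdo->prepare('INSERT INTO `table` (col1, col2, col3) VALUES (?, ?, ?)');

foreach (glob('/path/to/csv/*.csv') as $file) {
    $handle = fopen($file, 'r');
    $pdo->beginTransaction();        // one transaction per file keeps inserts fast
    while (($row = fgetcsv($handle)) !== false) {
        $stmt->execute($row);        // assumes each row has exactly 3 columns
    }
    $pdo->commit();
    fclose($handle);
    rename($file, '/path/to/done/' . basename($file));  // mark file as processed
    echo "$file imported\n";
}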

A dedicated server is available.

A good PC with 8 GB of memory is available.

What do you suggest?

Upvotes: 0

Views: 167

Answers (2)

Shiplu Mokaddim

Reputation: 57650

Use LOAD DATA INFILE. It's better to insert the data file by file.

LOAD DATA INFILE '/path/to/data.csv'
INTO TABLE `table`
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';

Run it in batch mode over the 20k files. Put the password in the command so mysql doesn't prompt for it on every file.

#!/bin/bash
# Import every CSV in the directory, one LOAD DATA INFILE per file.
# Note: the backticks must be escaped, or the shell treats `table`
# as command substitution inside the double-quoted -e string.

for f in /path/to/*.csv
do
    mysql -uroot -pPASSWORD DATABASE \
        -e "LOAD DATA INFILE '$f'
            INTO TABLE \`table\`
            FIELDS TERMINATED BY ','
            LINES TERMINATED BY '\n'"
    echo "$f is imported ..."
done

Upvotes: 3

Toby Allen

Reputation: 11213

It would be a much better idea to do it file by file. That allows you to restart if something goes wrong. Also, do all the files have an identical format?

You can use a PHP script to generate the SQL commands and then execute those; see the sketch below.
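
For example, something along these lines would write one statement per file into an import.sql script (the paths and table name are placeholders):

<?php
// Sketch: emit one LOAD DATA INFILE statement per CSV into import.sql,
// which can then be run with: mysql -uroot -p DATABASE < import.sql
// Paths and table name are placeholders.
$out = fopen('import.sql', 'w');
foreach (glob('/path/to/csv/*.csv') as $file) {
    fwrite($out, "LOAD DATA INFILE '" . addslashes($file) . "'\n"
               . "INTO TABLE `table`\n"
               . "FIELDS TERMINATED BY ','\n"
               . "LINES TERMINATED BY '\\n';\n");
}
fclose($out);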

Upvotes: 0
