manxing

Reputation: 3335

load multiple txt files into mysql

I have more than 40 txt files that need to be loaded into a table in MySQL. Each file contains 3 columns of data, and each column lists one specific type of data. The format of every txt file is exactly the same, but the file names vary. First I tried LOAD DATA LOCAL INFILE 'path/*.txt' INTO TABLE xxx

I thought using *.txt might let MySQL load all the txt files in that folder, but it turned out it doesn't.

So how can I get MySQL or Python to do this? Or do I need to merge the files into one file manually first and then use the LOAD DATA LOCAL INFILE command?

Many thanks!

Upvotes: 1

Views: 2491

Answers (2)

Sébastien

Reputation: 14841

If you want to avoid merging your text files, you can easily "scan" the folder and run the SQL import query for each file:

import os

for dirpath, dirsInDirpath, filesInDirPath in os.walk("yourFolderContainingTxtFiles"):
    for myfile in filesInDirPath:
        # Build the import query for this file; note the quotes around the path literal.
        sqlQuery = "LOAD DATA INFILE '%s' INTO TABLE xxxx (col1,col2,...);" % os.path.join(dirpath, myfile)
        # Execute the query here using your MySQL connector.
        # String formatting is used here to build the query, but you should use the safe
        # placeholders provided by the MySQL API instead of %s, to protect against SQL injection.
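
For reference, here is a minimal sketch of how that loop might execute each query with mysql-connector-python; the connection details, table name, and column names below are placeholders, and allow_local_infile=True is needed on the connection for LOCAL INFILE statements to be accepted:

import os
import mysql.connector  # assumes the mysql-connector-python package

# Placeholder connection settings; adjust to your own server and schema.
conn = mysql.connector.connect(
    host="localhost",
    user="youruser",
    password="yourpassword",
    database="yourdb",
    allow_local_infile=True,
)
cursor = conn.cursor()

for dirpath, dirnames, filenames in os.walk("yourFolderContainingTxtFiles"):
    for myfile in filenames:
        if not myfile.endswith(".txt"):
            continue
        path = os.path.join(dirpath, myfile)
        # The %s placeholder lets the connector quote and escape the file path for us.
        cursor.execute(
            "LOAD DATA LOCAL INFILE %s INTO TABLE xxxx (col1, col2, col3)",
            (path,),
        )

conn.commit()
cursor.close()
conn.close()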

Upvotes: 2

BuZz

Reputation: 17495

The simplest way is to merge your data into one file. That's fairly easy using Python:

fout = open("out.txt", "a")
# first file:
for line in open("file1.txt"):
    fout.write(line)
# now the rest (NB_FILES is the total number of files, so go up to and including it):
for num in range(2, NB_FILES + 1):
    f = open("file" + str(num) + ".txt")
    for line in f:
        fout.write(line)
    f.close()  # close each input file once it has been copied
fout.close()

Then run the command you know (... INFILE ...) to load the single file into MySQL. This works fine as long as the column separator is strictly the same in every file. Tabs are best in my opinion ;)
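
Since the question mentions that the file names vary, a variant of this merge that simply picks up every .txt file in a folder (rather than relying on numbered file names) could look like the sketch below; the folder and output paths are placeholders, and it assumes each input file ends with a newline:

import glob
import shutil

# Merge every .txt file in the (placeholder) folder into one output file.
with open("merged.txt", "w") as fout:
    for path in sorted(glob.glob("path/to/your/folder/*.txt")):
        with open(path) as fin:
            shutil.copyfileobj(fin, fout)  # assumes each file ends with a newline

The merged file can then be loaded with the LOAD DATA LOCAL INFILE command as described above.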

Upvotes: 0
