Connor Peet

Reputation: 6265

Import Large SQL File

I am a student conducting some research which involves a sort of data mining. I have several volunteer "node" servers which gather and produce SQL files for me to import on my server and analyze.

The problem is, these are very big files, and I need a way to import them quickly. The network recently expanded, and now there just isn't enough throughput on the hard drive for the MySQL console to import them as they come in. And time is important - there is a deadline for the research to be in, and I want to be actively gathering for as much time as possible beforehand and not have a queue waiting to be inserted.

I am wondering if there is a better way to import very large files - each one weighs in at about 100 MB. I've tried "\. myfile.sql", but that is incredibly slow. phpMyAdmin won't take files that big.

Any ideas? Thanks!

Upvotes: 3

Views: 661

Answers (2)

Kavin D

Reputation: 1

XAMPP Windows Configuration

  1. In my.ini (the MySQL configuration file), raise the size limit from 8M to 200M (or as large as you need).
  2. In a command prompt, navigate to mysql/bin.
  3. Place your SQL file in the mysql/bin folder.

cmd: mysql -u username -p databasename < sqlFile.sql

  4. When it asks for a password: the default is empty on localhost.
  5. Wait several minutes (depending on the SQL file size) - that's all. :)
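The size limit raised in step 1 most likely refers to MySQL's max_allowed_packet setting; a minimal sketch of the change, assuming the default my.ini layout of a XAMPP install on Windows:

```ini
; my.ini (in the XAMPP mysql folder) - raise the packet limit
; from the small default so large dump statements import cleanly
[mysqld]
max_allowed_packet=200M
```

Restart the MySQL service after editing the file so the new limit takes effect.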

Upvotes: -1

triclosan

Reputation: 5724

Have you tried mysql -uYOU -p < myfile.sql ?

UPD:

Or even mysql -uYOU -p < myfile.sql & (running it in the background) if you only have a short-lived remote console session
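The trailing & backgrounds the import so your terminal is free; combining it with nohup also lets the job survive the remote session dropping. A minimal sketch of the pattern, with a stand-in command where the real mysql invocation would go:

```shell
# nohup + & : the job keeps running after the terminal hangs up.
# For the real import, substitute:  mysql -uYOU -p YOUR_DB < myfile.sql
nohup sh -c 'echo import-finished' > import.log 2>&1 &
wait $!            # in practice you would simply log out instead of waiting
cat import.log     # should contain "import-finished"
```

Redirecting stdout and stderr to a log file means you can check later whether the import completed or reported errors.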

UPD2:

But the most efficient way is to use mysqlimport, as PinnyM advised. This assumes name_same_as_table.txt is a text file with DOS-style EOLs and tab-separated fields; the number and types of fields must match the destination table.

mysqlimport -uYOU -p --lock-tables --lines-terminated-by="\r\n" --fields-terminated-by="\t" YOUR_DB name_same_as_table.txt
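A quick way to sanity-check that a file really has the tab-separated, CRLF-terminated layout mysqlimport expects (the events table name and its two columns here are hypothetical):

```shell
# Build a two-row sample named after a hypothetical table events(id, name):
# fields separated by tabs, lines terminated by DOS-style \r\n
printf '1\tfirst\r\n2\tsecond\r\n' > events.txt

# od -c makes the separators visible: \t between fields, \r \n at line ends
od -c events.txt

# Then load it (commented out here; requires a running MySQL server):
# mysqlimport -uYOU -p --lock-tables --lines-terminated-by="\r\n" \
#     --fields-terminated-by="\t" YOUR_DB events.txt
```

mysqlimport derives the target table name from the file name, which is why the file must be named after the table.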

Upvotes: 3
