Reputation: 1106
I have an SQL file that contains 1.74+ million records; this is not your average database import.
I raised the upload limit from 8MB to 128MB, and the memory limit to 1GB in case it needs it. Only 251 of the records ever get processed before I get an out-of-memory error on a machine that has 1GB of RAM at its disposal, which makes no sense to me. Despite claiming to be importing records, the web host, like my local host, stalls. The uncompressed file is 230MB; compressed it's 36MB.
I tried to write a routine for my local server and found that I am not able to access or insert data. I get an Access denied error (42000/1044), and when I look it up, it's about access and needing root privileges...
Any suggestions on how to get the file either split properly or imported via a script? phpMyAdmin has, I am afraid to say, lost the plot. It's not helping much, and much of the documentation assumes you have physical access to the systems, so I can't find anything helpful in the docs. I keep going around in circles... HELP!
Upvotes: 0
Views: 926
Reputation: 4747
I recently found an answer about this on SO (here); I will try to find it again so I can provide its URL. This is the code I used to accomplish the same task.
// Allow the script to run indefinitely for a large import
ini_set('max_execution_time', 0);
set_time_limit(0);
// Note: upload_max_filesize cannot be changed at runtime with ini_set();
// raise it in php.ini or .htaccess before the upload happens.

// Open the connection first (placeholder credentials; use your own)
$connection = mysqli_connect('localhost', 'user', 'password', 'db_name');

// Temporary variable, used to accumulate the current query
$templine = '';
$filename = 'filename.sql';
// Read the entire file into an array of lines
// (caution: this loads the whole file into memory at once)
$lines = file($filename);
// Loop through each line
foreach ($lines as $line) {
    // Skip comments and blank lines
    if (substr($line, 0, 2) == '--' || trim($line) == '') {
        continue;
    }
    // Add this line to the current statement
    $templine .= $line;
    // If the line ends with a semicolon, the statement is complete
    if (substr(trim($line), -1, 1) == ';') {
        // Perform the query
        mysqli_query($connection, $templine) or print('Error performing query \'<strong>' . $templine . '\': ' . mysqli_error($connection) . '<br /><br />');
        // Reset temp variable to empty
        $templine = '';
    }
}
echo "Tables imported successfully";
Upvotes: 0
Reputation: 1005
It is not really a good idea to use a browser-based UI for an import this large.
1) If it is an SQL file, try using the mysql command line:
mysql -uuser -ppassword db_name < file.sql
2) If it is a CSV file, use mysqlimport. The table name must match the filename (without extension), and you can define the delimiters and whether to skip the first row (if it contains headers), as in the example below.
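For instance, assuming a comma-delimited file tablex.csv with a header row, loading into table tablex (adjust the delimiter, credentials, and path to match your setup):
mysqlimport --local --fields-terminated-by=',' --ignore-lines=1 -uuser -ppassword db_name /path/to/tablex.csv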
3) Or try out HeidiSQL, but I found the command line works best, even on Windows.
4) If your only choice is the installed phpMyAdmin and you cannot upload any scripts such as Adminer to the remote server, then one of the limited options left is to split the SQL file/queries and process the pieces one at a time (a splitting sketch follows the example below). An example of correct and incorrect splitting:
bigfile.sql
Insert into tablex values ('A','AA'),
('B','BB'),('C','CC'),('D','DD');
Splits to
file1.sql
Insert into tablex values ('A','AA'), ('B','BB');
file2.sql
Insert into tablex values ('C','CC'), ('D','DD');
You should NOT split it like below; it will fail because neither piece is a complete statement...
file1.sql
Insert into tablex values ('A','AA'),
file2.sql
('B','BB'), ('C','CC'), ('D','DD');
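To automate the split, here is a minimal PHP sketch (the filenames and the 1000-statements-per-chunk figure are arbitrary assumptions) that streams bigfile.sql and writes only complete statements to numbered chunk files, so every chunk is independently importable:
// Split bigfile.sql into chunks of complete statements.
// Assumes statements end with ';' at the end of a line.
$in = fopen('bigfile.sql', 'r');
$chunkSize = 1000; // statements per output file (arbitrary choice)
$fileNo = 1;
$count = 0;
$out = fopen('file' . $fileNo . '.sql', 'w');
$statement = '';
while (($line = fgets($in)) !== false) {
    if (substr($line, 0, 2) == '--' || trim($line) == '') {
        continue;
    }
    $statement .= $line;
    if (substr(trim($line), -1, 1) == ';') {
        fwrite($out, $statement);
        $statement = '';
        // Start a new output file every $chunkSize statements
        if (++$count % $chunkSize == 0) {
            fclose($out);
            $fileNo++;
            $out = fopen('file' . $fileNo . '.sql', 'w');
        }
    }
}
fclose($in);
fclose($out);
Note that this splits between statements; it does not break up a single huge multi-row INSERT the way the hand-edited example above does, so a very large single statement would still need the manual treatment shown.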
Upvotes: 2