Reputation: 1322
I have a database exported from an old website; the site was built with Joomla 1.5.
Now I need to put the website back up, but I can't import the database.
The database is a compressed .gz file. I know the old server was running CentOS (5, I think) with the DirectAdmin control panel.
I used the usual default settings to export the database through phpMyAdmin.
Now I can't import it on my PC.
I use WampServer and I keep getting a "maximum execution time of 300 seconds exceeded" error in phpmyadmin/import.php.
The database file is 2.5 MB compressed and 28 MB uncompressed.
I tried every option to split it, divide it, or copy/paste the code into PHP, but the file is just too large for that.
I changed all the settings for script execution time (to unlimited), max upload size, max memory use... but phpMyAdmin simply drops the file after some time.
Is there any way to split the SQL file into a few smaller files and upload them separately?
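For reference, the php.ini values I raised look roughly like this (the exact numbers are just what I tried; I restarted Apache after every change):

; php.ini (WampServer)
max_execution_time  = 0      ; 0 = unlimited
upload_max_filesize = 64M
post_max_size       = 64M    ; needs to be at least upload_max_filesize
memory_limit        = 512M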
Upvotes: 0
Views: 3377
Reputation: 4136
Greetings, Srba,
There is a tool for doing this on platforms that have limits, such as shared hosting: BigDump.
With it, you upload your large dump to the server, then configure BigDump to insert it into your DB.
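As a rough sketch of the setup (the variable names are what I remember from the configuration block near the top of bigdump.php; the credentials and filename here are placeholders, so check your own copy):

// Configuration block near the top of bigdump.php (placeholder values)
$db_server   = 'localhost';
$db_name     = 'joomla_db';    // the database you are restoring into
$db_username = 'root';
$db_password = '';
$filename    = 'dump.sql.gz';  // BigDump can also read gzipped dumps,
                               // or leave this empty and upload via its form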
As for splitting the dump file yourself, it would depend on the nature of the dump. It is likely multiple queries, each with a terminator, I'd imagine something like '),(' between value groups or ';' at the end of each statement.
First you would need to identify the terminator of each query, then split on it, something like:
// read the whole dump into memory
$con = file_get_contents( 'file.sql' );
// split on the terminator between the chunks
$parts = explode( '),(', $con );
You now have the individual chunks, which you can run one at a time. You'll also need to raise the memory limit and execution time, like:
// allow the script to run for up to four weeks
set_time_limit( 2419200 );
// raise PHP's memory limit well above the 28MB dump
ini_set( 'memory_limit', '999M' );
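Putting it together, here is a minimal sketch that reads the dump and runs it statement by statement against a local MySQL server. It assumes the dump uses ';' followed by a newline as the statement terminator (the phpMyAdmin default); the host, credentials, database name and file path are placeholders:

<?php
// Minimal sketch: split a phpMyAdmin dump on ";\n" and run each statement.
// Host, credentials, database name and file path below are placeholders.
set_time_limit( 0 );                        // no execution time limit
ini_set( 'memory_limit', '999M' );          // the 28MB dump fits in memory

$mysqli = new mysqli( 'localhost', 'root', '', 'joomla_db' );
if ( $mysqli->connect_error ) {
    die( 'Connect failed: ' . $mysqli->connect_error );
}

$con   = file_get_contents( 'file.sql' );   // whole dump as one string
$parts = explode( ";\n", $con );            // rough split into statements

foreach ( $parts as $chunk ) {
    // drop the "-- ..." comment lines phpMyAdmin adds, keep the SQL itself
    $lines = array_filter( explode( "\n", $chunk ), function ( $line ) {
        return strpos( ltrim( $line ), '--' ) !== 0;
    } );
    $statement = trim( implode( "\n", $lines ) );
    if ( $statement === '' ) {
        continue;
    }
    if ( ! $mysqli->query( $statement ) ) {
        echo 'Error: ' . $mysqli->error . "\n";
    }
}
$mysqli->close();

Splitting on a bare ';' will break if any of the data itself contains one, so treat this as a rough approach; BigDump handles statement boundaries more carefully.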
Upvotes: 2