Rajesh Hatwar

Reputation: 1933

How to Import large or moderate size SQL dump into Mysql/MariaDB using docker

I am new to Docker and want to create a MariaDB container. While bringing the container up I want to import an SQL dump of about 50 MB. I am using the following Docker Compose file, but the import currently takes more than an hour.

services:
  mariadb:
    image: mariadb:10.1
    container_name: mariadb
    volumes:
      # Persist the data directory on the host
      - ./mountedVolume/dbvolume:/var/lib/mysql
      # Files in /docker-entrypoint-initdb.d are imported on first initialisation only
      - ./webAppDatabase/dctdatabase/appDB.sql:/docker-entrypoint-initdb.d/appDB.sql
    command: --max_allowed_packet=500M
    environment:
      MYSQL_ROOT_PASSWORD: root123
    ports:
      - "3306:3306"

I have tried using the following MySQL settings before importing the SQL dump:

SET @OLD_AUTOCOMMIT=@@AUTOCOMMIT, AUTOCOMMIT = 0;
SET @OLD_UNIQUE_CHECKS=@@UNIQUE_CHECKS, UNIQUE_CHECKS = 0;
SET @OLD_FOREIGN_KEY_CHECKS=@@FOREIGN_KEY_CHECKS, FOREIGN_KEY_CHECKS = 0;

After importing the SQL dump I reset the values:

SET FOREIGN_KEY_CHECKS=@OLD_FOREIGN_KEY_CHECKS;
SET UNIQUE_CHECKS=@OLD_UNIQUE_CHECKS;
SET AUTOCOMMIT=@OLD_AUTOCOMMIT;
COMMIT;
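For completeness, here is roughly how I applied those settings, streaming the whole dump through a single mysql session (a sketch; the database name appdb is a placeholder for my actual schema):

# Prepend the speed-up settings, stream the dump, then commit,
# all in one session inside the container (appdb is a placeholder).
( echo "SET autocommit=0; SET unique_checks=0; SET foreign_key_checks=0;"
  cat ./webAppDatabase/dctdatabase/appDB.sql
  echo "COMMIT;"
) | docker exec -i mariadb mysql -uroot -proot123 appdb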

But the above settings didn't help. I also tried the solution posted at the link, but that didn't help either.

I have also tried splitting the SQL file into smaller chunks and importing them into the MariaDB container, but it took roughly the same time.

I just want to import the ~50 MB SQL dump in less time. I have tried every approach I could find on the internet, but none of them reduced the import time. Please let me know if there is a workaround.

Upvotes: 2

Views: 3478

Answers (1)

Rajesh Hatwar

Reputation: 1933

After searching for a long time, I found a solution. Here I will explain three different ways to import/export MySQL/MariaDB databases, with performance statistics for each.

The traditional way of importing an SQL dump takes a lot of time because the server has to create the tables first and then import the data sequentially. Performance can be improved a little by separating the CREATE and INSERT statements into two files and executing them one after the other, but the gain over a plain import is small.
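For example, a crude split can be done with grep (a sketch; it assumes the default mysqldump layout where every INSERT statement occupies a single line, so a hand-edited dump would need a real parser):

# Separate schema statements from data rows (a sketch, not a parser)
grep -v '^INSERT INTO' appDB.sql > schema.sql
grep '^INSERT INTO' appDB.sql > data.sql

# Load the schema first, then the data
mysql -u [user] -p [DATA_BASE_NAME] < schema.sql
mysql -u [user] -p [DATA_BASE_NAME] < data.sql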

For a bigger improvement, import/export the SQL dump in gzip (.gz) format.

Example:

To Export DB:

mysqldump -u [user] -p [DATA_BASE_NAME] | gzip > [SQL_FILE_DUMP.sql.gz]   

To Import DB:

gunzip < [SQL_FILE_DUMP.sql.gz]  | mysql -u [user] -p[password] [DATA_BASE_NAME]
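Note that the official MySQL/MariaDB images can usually import a compressed dump directly: a file ending in .sql.gz mounted into /docker-entrypoint-initdb.d is gunzipped by the entrypoint on first initialisation (worth verifying for your exact image tag). So the Compose file from the question could mount the compressed dump as-is:

    volumes:
      - ./webAppDatabase/dctdatabase/appDB.sql.gz:/docker-entrypoint-initdb.d/appDB.sql.gz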

To really boost import/export performance, take a snapshot of the whole /var/lib/mysql directory instead of importing SQL at all.

[Note: since I use InnoDB tables I did not test other storage engines. I found this approach in a blog post.]

To Export DB:

docker run --rm --volumes-from mariadb carinamarina/backup backup --source /var/lib/mysql/ --stdout --zip > db-data-volume.tar.gz

To Import DB:

docker run --rm --interactive --volumes-from mariadb carinamarina/backup restore --destination /var/lib/mysql/ --stdin --zip < db-data-volume.tar.gz
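If you would rather not depend on the carinamarina/backup image, the same snapshot idea works with a throwaway container and plain tar (a sketch; stop the database first so the copied data directory is consistent):

# Export: archive the data volume into the current host directory
docker stop mariadb
docker run --rm --volumes-from mariadb -v "$(pwd)":/backup ubuntu \
  tar czf /backup/db-data-volume.tar.gz -C /var/lib/mysql .
docker start mariadb

# Import: unpack the archive into the (stopped) container's data volume
docker stop mariadb
docker run --rm --volumes-from mariadb -v "$(pwd)":/backup ubuntu \
  tar xzf /backup/db-data-volume.tar.gz -C /var/lib/mysql
docker start mariadb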

Test statistics for importing ~50 MB of MySQL/MariaDB data:

Method               Size     Time taken
-------------------  -------  -----------
Traditional import   50 MB    50 min
gzip (.sql.gz)       50 MB    29 min
/var/lib/mysql/      50 MB    ~20 seconds

[Note: the test cases were run on Ubuntu 14.04 with an Intel® Core™ i5-3470 CPU and 4 GB of RAM; a higher-end system should be faster.]

Upvotes: 2
