Gihan

Reputation: 4283

unable to import large database to docker mysql container

I'm trying to import a large database into a mysql container. I've mounted host directories as volumes for the mysql container, so the data is persistent on the host. The SQL file being imported is 14 GB+. The mysql container becomes unresponsive halfway through the import. When I run docker stats I can see the CPU % usage drop below 1 once the mysql container has eaten all the available memory. I tried increasing Docker's memory up to 10 GB, and the import gets further (creates more tables) the more memory I allocate, but I cannot allocate more than 10 GB from the host.

Following is my docker-compose.yml file

mysql:
    image: mysql:5.6
    environment:
        - MYSQL_ROOT_PASSWORD=12345678
    volumes:
        - ./mysql/lib:/var/lib/mysql
        - ./mysql/conf.d:/etc/mysql/conf.d
        - ./mysql/log:/var/log/mysql
        - /tmp:/tmp
    ports:
        - "3306:3306"

I'm using Docker for Mac, which runs Docker version 1.12.1.

I was using docker exec -it docker_mysql_1 /bin/bash to log in to the container and import the SQL file from /tmp.
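
The import inside the container was the usual mysql client redirect, roughly like this (the database name and dump filename here are illustrative):

mysql -uroot -p12345678 -e "CREATE DATABASE IF NOT EXISTS mydb"
mysql -uroot -p12345678 mydb < /tmp/dump.sql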

I also tried the way recommended by the mysql repo, mounting the SQL file into /docker-entrypoint-initdb.d. But that halts the mysql init as well. For reference, that attempt looked roughly like this (the dump filename is illustrative):
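
mysql:
    image: mysql:5.6
    environment:
        - MYSQL_ROOT_PASSWORD=12345678
    volumes:
        - ./mysql/lib:/var/lib/mysql
        # any .sql file in this directory is executed once, at first init
        - ./dump.sql:/docker-entrypoint-initdb.d/dump.sql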

UPDATE 1

$ docker info
Containers: 1
 Running: 0
 Paused: 0
 Stopped: 1
Images: 2
Server Version: 1.12.1
Storage Driver: aufs
 Root Dir: /var/lib/docker/aufs
 Backing Filesystem: extfs
 Dirs: 18
 Dirperm1 Supported: true
Logging Driver: json-file
Cgroup Driver: cgroupfs
Plugins:
 Volume: local
 Network: host bridge null overlay
Swarm: inactive
Runtimes: runc
Default Runtime: runc
Security Options: seccomp
Kernel Version: 4.4.20-moby
Operating System: Alpine Linux v3.4
OSType: linux
Architecture: x86_64
CPUs: 4
Total Memory: 9.744 GiB
Name: moby
ID: 43S4:LA5E:6MTG:IFOG:HHJC:HYLX:LYIT:YU43:QGBQ:K5I5:Z6LP:AENZ
Docker Root Dir: /var/lib/docker
Debug Mode (client): false
Debug Mode (server): true
 File Descriptors: 16
 Goroutines: 27
 System Time: 2016-10-12T07:52:58.516469676Z
 EventsListeners: 1
No Proxy: *.local, 169.254/16
Registry: https://index.docker.io/v1/
Insecure Registries:
 127.0.0.0/8


$ df -h
Filesystem      Size   Used  Avail Capacity iused      ifree %iused  Mounted on
/dev/disk1     233Gi  141Gi   92Gi    61% 2181510 4292785769    0%   /
devfs          193Ki  193Ki    0Bi   100%     668          0  100%   /dev
map -hosts       0Bi    0Bi    0Bi   100%       0          0  100%   /net
map auto_home    0Bi    0Bi    0Bi   100%       0          0  100%   /home
/dev/disk2s2   466Gi   64Gi  401Gi    14%    1857 4294965422    0%   /Volumes/mac
/dev/disk2s3   465Gi   29Gi  436Gi     7%  236633    3575589    6%   /Volumes/PORTABLE
/dev/disk3s1   100Mi   86Mi   14Mi    86%      12 4294967267    0%   /Volumes/Vagrant

I was using directories on /dev/disk1 to mount the volumes.

Upvotes: 10

Views: 15254

Answers (3)

Nisha Gadhe

Reputation: 71

I solved a phpMyAdmin "import of large database" error by changing an environment variable in docker-compose.yml:

UPLOAD_LIMIT=1G


myadmin:
        image: phpmyadmin/phpmyadmin
        container_name: phpmyadmin
        ports:
            - "8083:80"
        environment:
            - UPLOAD_LIMIT=1G
            - PMA_ARBITRARY=1
            - PMA_HOST=${MYSQL_HOST}
        restart: always
        depends_on:
            - mysqldb

Upvotes: 6

tanu

Reputation: 21

I ran into a similar problem as well. The following process might help you. First, copy your SQL file (filename.sql) into the db container:

docker cp filename.sql docker_db_container:/filename.sql

Then log in to your db container and populate the database from this file (filename.sql). Open mysql inside the container, select the database you want to import into (use database_name;), and run:

source /filename.sql
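
Put together, the whole session looks roughly like this (container, database, and file names are illustrative):

docker exec -it docker_db_container bash
mysql -u root -p
mysql> use database_name;
mysql> source /filename.sql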

If you still run into an issue with large packet sizes, increase the container's max allowed packet setting:

docker exec -it docker_db_container bash -c "echo 'max_allowed_packet = 1024M' >> /etc/mysql/mysql.conf.d/mysqld.cnf"
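
Note that mysqld only reads the config file at startup, so restart the container afterwards for the setting to take effect:

docker restart docker_db_container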

Upvotes: 2

n2o

Reputation: 6477

I had a similar issue when trying to load a big SQL file into my database. I just had to increase the maximum packet size in the container, and the import worked as expected. For example, if you want to raise the maximum packet size to 512 MB and your container is named my_mysql, you can adjust the setting in a running container with this command:

docker exec -it my_mysql bash -c "echo 'max_allowed_packet = 512M' >> /etc/mysql/mysql.conf.d/mysqld.cnf" 

This appends the line to the config file. After this, you need to restart the container:

docker restart my_mysql
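
If you prefer to make the setting part of the service definition instead of patching a running container, roughly the same effect can be achieved declaratively: the official mysql image passes any command: arguments through to mysqld, so a sketch based on the questioner's compose file would be:

mysql:
    image: mysql:5.6
    # flags given here are appended to mysqld by the image's entrypoint
    command: --max_allowed_packet=512M
    environment:
        - MYSQL_ROOT_PASSWORD=12345678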

Upvotes: 3
