Reputation: 3703
We need to copy a schema (around 70GB) from a MySQL Community Edition server on Windows Server to another machine. Doing a mysqldump is not ideal for us because this is a production server and the dump slows it down.
So we thought of copying the data files to the new machine and simply starting MySQL. But...
Some of the actual data is apparently in the ibdata1 file, and the tables that have their data in ibdata1 generate a "Table doesn't exist in engine" error.
So we thought we'd copy ibdata1 to the new machine as well. We did that. Now, MySQL on the new machine doesn't start.
We tried deleting ib_logfile0 and ib_logfile1 and starting the service, but still no luck. It doesn't start.
Any idea how to solve this?
Upvotes: 1
Views: 2982
Reputation: 3570
This can be useful for your case:
https://www.percona.com/doc/percona-xtrabackup/2.4/index.html
Percona XtraBackup is an open-source hot backup utility for MySQL-based servers that doesn't lock your database during the backup.
UPDATE:
I spent a lot of time (a while ago) trying the same thing as you. I even got to see my_table with SHOW TABLES on the second server, but when I ran SELECT * FROM my_table I got ERROR 1146 (42S02): Table 'my_table' doesn't exist (that drove me crazy). See my old question: strange results when manually database copy to another server
Another idea that occurs to me now is to handle the backup in chunks, using a cron job and mysqldump with a WHERE clause, to avoid slowing down production.
Sorry for my English
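The chunked approach above might look something like this. This is only a sketch under assumptions not in the question: it assumes the table has an integer primary key `id`, and every name and number here (mydb, mytable, MAX_ID) is a placeholder.

```shell
#!/bin/sh
# Sketch of a chunked dump by primary-key range. Placeholders throughout:
# the real schema, table, and key column come from your own database.
DB=mydb
TABLE=mytable
CHUNK=1000000     # width of each id range per dump
MAX_ID=5000000    # e.g. the result of: SELECT MAX(id) FROM mytable

start=0
while [ "$start" -lt "$MAX_ID" ]; do
  end=$((start + CHUNK))
  # `echo` keeps the sketch runnable anywhere; remove it to actually dump.
  echo mysqldump --single-transaction \
    --where="id >= $start AND id < $end" \
    "$DB" "$TABLE" ">" "chunk_${start}.sql"
  start=$end
done
```

Each chunk is then a separate, short-lived dump, so no single pass holds resources on the production server for long.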
Upvotes: 1
Reputation: 562330
As you discovered, you can't just copy datafiles. There are dependencies between individual table files, and the central ibdata1, and the ib_log files. There's also some data in RAM in your MySQL Server process. They must be synchronized, or you will only copy corrupted data.
The only way to safely copy the files is to shut down the MySQL Server service, so all changes to those files are stopped while you are copying them. Obviously, this is worse than making the database slow — it is completely unavailable to applications while it is shut down.
This is why backup tools exist, to copy data safely and get a consistent snapshot of data as of a given moment in time.
You can use mysqldump --single-transaction, which should give a consistent view of the data as of the moment the dump started, without locking the database or slowing it down (no more than one extra thread running SELECT * FROM <tablename>...). This works only if all your tables are InnoDB, so they support transactions. And you must not run any CREATE, ALTER, DROP, RENAME, TRUNCATE, or GRANT/REVOKE statements while the mysqldump is running.
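As a concrete sketch of that command (the host, user, and schema names are placeholders, not values from the question):

```shell
# Hedged sketch: dump one schema without locking it. --quick streams rows
# to the client instead of buffering whole tables in memory, which matters
# for a ~70GB schema. Host/user/schema names below are placeholders.
CMD="mysqldump -h prod-server -u backup_user -p \
  --single-transaction --quick --routines --triggers mydb"
echo "$CMD > mydb.sql"
```

Running it from the new machine also means the dump file never touches the production server's disk.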
Another answer suggested using Percona XtraBackup, but that software does not provide a download for Windows. The closest option is to run XtraBackup in a Docker container on Windows (this is experimental and not a recommended solution for a novice).
Another option that many people use is to set up a replica MySQL instance with MySQL Replication, and use it as the source for backups. Then if the backup slows down the replica instance, that doesn't impact your applications that use the master instance.
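If you go the replica route, the same dump command simply targets the replica instead of the primary. A sketch, again with placeholder names:

```shell
# Hypothetical: once a replica is running, point the backup at it so the
# dump load never touches the primary. All names here are placeholders.
# Before trusting the snapshot's freshness, check replication lag on the
# replica (SHOW REPLICA STATUS in MySQL 8.0.22+, SHOW SLAVE STATUS earlier).
CMD="mysqldump -h replica-host -u backup_user -p --single-transaction mydb"
echo "$CMD > mydb.sql"
```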
Upvotes: 0