Reputation: 3523
I have a decent-sized (2.6 GB) SQL file. I can't produce a dump that excludes certain tables, because this is production and our hosting people don't want us running dumps ourselves, since that locks the database files. They replicated the database and handed me a SQL file instead, and I've been looking for a way to ignore tables so I can leave out the very large ones. I don't need them: I have that data locally, and they aren't the reason I need the backup.
I don't want to import this into another database and work with it there, because the whole point is to find a quicker way to import the data. Importing 2.6 gigs takes forever.
Going through the file by hand and editing out the tables I don't want would also take a long time. My assumption is there isn't a way to do this, but I figured I would post and see.
Upvotes: 6
Views: 8732
Reputation: 12038
You'll need to parse the dump file to grab just the tables you need. Here is a script that does this rather well:
http://kedar.nitty-witty.com/blog/mydumpsplitter-extract-tables-from-mysql-dump-shell-script
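The underlying idea is simple enough to sketch with `sed`: mysqldump writes a ``-- Table structure for table `name` `` comment before each table's section, so you can print everything between one table's header and the next. This is a minimal sketch, not the linked script; the file and table names (`full_dump.sql`, `users`, `logs`) are made-up examples, and it assumes your dump uses mysqldump's standard section headers.

```shell
# A miniature stand-in for the big dump file (illustration only):
printf -- '-- Table structure for table `users`\nCREATE TABLE users (id INT);\n-- Table structure for table `logs`\nCREATE TABLE logs (id INT);\n' > full_dump.sql

# Print everything from the `users` header up to the next table's header.
# The trailing header line it captures is just a SQL comment, so it is
# harmless when you later import the extracted file.
sed -n '/^-- Table structure for table `users`/,/^-- Table structure for table `logs`/p' \
    full_dump.sql > users_only.sql

cat users_only.sql
```

The linked script automates this pattern: it scans the headers once and can split out every table (or a named subset) without you hand-editing a multi-gigabyte file.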
Upvotes: 3