Reputation: 84
I don't know if this is the correct way of doing this, but here is the scenario. We have multiple servers in multiple locations, each with a MySQL database on it. Each database's structure is exactly the same. The goal is to get one or two tables from each database into one single location.
I don't want to run a bunch of crons on a bunch of different servers. I want to run all the actions on one server, namely the server where the consolidated database lives.
Dumping a table via bash is easy using ssh and mysqldump. The challenge here is that the data I need does not live in a table but in a view, so getting the data requires a mysql -e instead of a mysqldump.
But that method dumps the data into a file as raw data, not as an importable .sql file.
Any guidance and ideas on the above will be fantastic.
Thanks
UPDATE:
OK, so the goal is to run the above command remotely from the consolidation server and get the file locally. The process can then import the data from the view into a normal table with the same structure as the view.
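For what it's worth, a minimal sketch of that loop, run entirely from the consolidation server. All hostnames, database, and table names below are placeholders, and the script only echoes the commands it would run so they can be reviewed first (drop the leading `echo` to execute them):

```shell
#!/bin/sh
# Sketch only: hosts, database, and table names are assumptions.
DB=mydb                # consolidated database name (placeholder)
VIEW=m_view            # the view to export on each remote server
TARGET=m_m_test        # local table with the same structure as the view
OUTDIR=/tmp

for HOST in db1.example.com db2.example.com; do
  TSV="$OUTDIR/${HOST}_${VIEW}.tsv"
  # 1) Dump the view as tab-separated rows over ssh; --batch and
  #    --skip-column-names produce plain TSV that LOAD DATA reads by default.
  echo "ssh $HOST \"mysql --batch --skip-column-names -e 'SELECT * FROM $VIEW' $DB\" > $TSV"
  # 2) Load the file into the consolidated table on this server.
  echo "mysql $DB -e \"LOAD DATA LOCAL INFILE '$TSV' INTO TABLE $TARGET\""
done
```

Since the remote side writes to stdout, no file ever needs to exist on the remote servers, which avoids the `INTO OUTFILE` filesystem-permission question there.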
Upvotes: 1
Views: 80
Reputation: 1135
You can dump the data in the view by using SELECT * INTO OUTFILE, e.g.:
## Exporting data
SELECT * INTO OUTFILE '/tmp/my_out.text'
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
FROM m_view;
## Creating migration table
CREATE TABLE m_m_test LIKE migrate_table;
## Loading data from file
LOAD DATA INFILE '/tmp/my_out.text' INTO TABLE m_m_test
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
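Note that `CREATE TABLE ... LIKE` only works against a base table (here `migrate_table`), not against a view. If all you have is the view, an empty table with the view's column layout can be created instead, assuming the same `m_view`/`m_m_test` names as above:

```sql
## Creating an empty table from the view's result set
CREATE TABLE m_m_test AS
SELECT * FROM m_view LIMIT 0;
```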
Source: https://www.pythian.com/blog/can-just-dump-data-view-using-mysqldump/
Upvotes: 2