Reputation: 18158
I am trying to make a shell script that creates a MySQL dump and then puts it on another computer. I have already set up passwordless (key-based) SSH and SFTP. The script below creates the MySQL dump file on the local computer when it is run and doesn't throw any errors; however, the file "dbdump.db" is never put on the remote computer. If I open the sftp connection and run the put command by hand, it works.
contents of mysql_backup.sh
mysqldump --all-databases --master-data > dbdump.db
sftp -b /home/tim [email protected] <<EOF
put dbdump.db
exit
EOF
Upvotes: 1
Views: 6401
Reputation: 1635
Your initial approach is only a few characters off from working, though. You're telling sftp to read its batch commands from /home/tim (-b /home/tim). If you change this to -b -, it will read its batch commands from stdin instead. Something along these lines should work; and if -b /home/tim was intended to, say, change directory on the remote side, you can add cd /home/tim to your here-document.
mysqldump --all-databases --master-data > dbdump.db
sftp -b - [email protected] <<EOF
put dbdump.db
exit
EOF
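If the original -b /home/tim was actually meant to land the dump in /home/tim on the remote machine, a sketch of that variant would be the following (the remote directory is an assumption, not something from your script):
# Same fix as above, plus a remote cd so the dump lands in /home/tim
# (assumes that directory exists on the remote host)
mysqldump --all-databases --master-data > dbdump.db
sftp -b - [email protected] <<EOF
cd /home/tim
put dbdump.db
exit
EOF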
Upvotes: 0
Reputation: 115
Write the mput/put commands into one file (file_contains_put_command) and try the command below.
sftp2 -B file_contains_put_command /home/tim [email protected] >> log_file
Example:
echo binary > sample_file
echo mput dbdump.db >> sample_file
echo quit >> sample_file
sftp2 -B sample_file /home/tim [email protected] >> log_file
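For reference, the stock OpenSSH sftp client supports the same batch-file approach with a lowercase -b flag; a rough equivalent might be (the binary command is not needed there):
# Build a batch file for the OpenSSH sftp client and run it
echo "put dbdump.db" > sample_file
echo "quit" >> sample_file
sftp -b sample_file [email protected] >> log_file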
Upvotes: 1
Reputation: 47945
Try using scp instead; that should be easier in your case.
scp dbdump.db [email protected]:/home/tim/dbdump.db
Both sftp and scp use SSH.
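A minimal backup script built around scp might look like the sketch below; the remote path /home/tim/ and the bail-out when mysqldump fails are assumptions, not part of your original script.
#!/bin/sh
# Dump all databases, then copy the result over SSH with scp.
# Abort before copying if the dump step fails.
mysqldump --all-databases --master-data > dbdump.db || exit 1
scp dbdump.db [email protected]:/home/tim/dbdump.db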
Upvotes: 3