Reputation: 171
I have a database backup command that takes a MySQL dump and then uploads that dump file to AWS S3. When I run the command as a normal user it works perfectly, but when I use the same command in a cron job it fails.
I have checked the syslog and there is no error message indicating a problem after the job. There is only a line saying the job was run, and then cron goes on to the next job.
The command is as follows (I have removed the sensitive parts):
mysqldump -u {{ db_user }} -p{{ db_password }} {{ db_name }} > /home/db_backup.sql | aws s3 cp /home/db_backup.sql s3://{{ s3_url }}/$(date --iso-8601=seconds)_db.sql --profile backupprofile
When this command is run by a normal user there is a warning not to use the MySQL password on the command line, but this is essential for the command to work without interaction. There is also a second line of output from S3 saying that the upload worked. Could these outputs be affecting the cron job in some way?
Upvotes: 0
Views: 132
Reputation: 584
Try checking the environment variables: cron passes a minimal set of environment variables to your jobs. You can set the PATH easily inside the crontab:
PATH=/usr/local/bin:/usr/sbin
Also, many cron implementations execute commands using sh, while you might be using another shell in your script. You can tell cron to run all commands in bash by setting the shell at the top of your crontab:
SHELL=/bin/bash
Cron also treats the % symbol specially (an unescaped % ends the command and the rest becomes standard input), so you need to escape it with a backslash if it appears anywhere in your command.
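Putting these together, a minimal crontab sketch might look like this (the schedule and log path are assumptions for illustration, the {{ }} placeholders are from the question, and && is used instead of | so the upload only starts after the dump has finished):
SHELL=/bin/bash
PATH=/usr/local/bin:/usr/bin:/bin:/usr/sbin
# Daily at 02:00; all output is appended to a log so failures become visible.
# There is no % in this command, but a format such as date +%F would need each % escaped as \%.
0 2 * * * mysqldump -u {{ db_user }} -p{{ db_password }} {{ db_name }} > /home/db_backup.sql && aws s3 cp /home/db_backup.sql s3://{{ s3_url }}/$(date --iso-8601=seconds)_db.sql --profile backupprofile >> /home/db_backup.log 2>&1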
Upvotes: 1
Reputation: 190
You will need to use full paths in your cron jobs; I see you left them out for mysqldump and also for aws. I would run
whereis mysqldump
and
whereis aws
to find the full path you need for each.
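For example, assuming whereis reports /usr/bin/mysqldump and /usr/local/bin/aws (substitute whatever it prints on your system, and your own schedule), the cron entry would reference the binaries like this:
0 2 * * * /usr/bin/mysqldump -u {{ db_user }} -p{{ db_password }} {{ db_name }} > /home/db_backup.sql | /usr/local/bin/aws s3 cp /home/db_backup.sql s3://{{ s3_url }}/$(date --iso-8601=seconds)_db.sql --profile backupprofile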
Upvotes: 1
Reputation: 584
If the output that comes first after running your command is interactive, that is, if it asks you to hit enter or something like that, then this is the issue; otherwise there shouldn't be any problem with the output.
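A quick way to check (a sketch using the same placeholders as the question): run the command from a normal shell with stdin redirected from /dev/null, which roughly mimics the non-interactive environment cron provides, and see whether it hangs waiting for input.
# If this hangs, something in the pipeline is prompting for input; cron would hit the same problem.
bash -c 'mysqldump -u {{ db_user }} -p{{ db_password }} {{ db_name }} > /home/db_backup.sql | aws s3 cp /home/db_backup.sql s3://{{ s3_url }}/$(date --iso-8601=seconds)_db.sql --profile backupprofile' < /dev/null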
Upvotes: 0