Homunculus Reticulli

Reputation: 68466

Cron jobs run differently under cron than when the commands are typed at the console (where they run correctly)

I have encountered a strange problem with crontab: I have correct entries in crontab and have set my path correctly, and yet when the jobs are run through crontab, they behave differently (e.g. log files are not produced, and data is not written to file). However, when I type the EXACT commands at the command line, the scripts work as intended.

This is not likely a permissions problem, as I am running the cronjobs as myself.

Here are the relevant details:

Crontab entries:

0,30 8-18 * * 1-5 python /home/rascal/work/skunkworks/scripts/python/downloaders/make_whoopy.py

0,30 7-18 * * 1-5 /home/rascal/work/skunkworks/scripts/python/scrapers/funny_quotes.sh

0,30 7-20 * * 1-5 /home/rascal/work/skunkworks/scripts/python/scrapers/even_funnier_quotes.sh

/var/log/syslog entries (filtered for CRON)

Nov  1 09:00:01 BIGBERTHA CRON[8998]: (rascal) CMD (/home/rascal/work/skunkworks/scripts/python/scrapers/funny_quotes.sh)
Nov  1 09:00:01 BIGBERTHA CRON[8999]: (rascal) CMD (/home/rascal/work/skunkworks/scripts/python/scrapers/even_funnier_quotes.sh)
Nov  1 09:00:01 BIGBERTHA CRON[9004]: (rascal) CMD (python /home/rascal/work/skunkworks/scripts/python/downloaders/make_whoopy.py)
Nov  1 09:09:01 BIGBERTHA CRON[9306]: (root) CMD (  [ -x /usr/lib/php/sessionclean ] && /usr/lib/php/sessionclean)
Nov  1 09:09:01 BIGBERTHA CRON[9313]: (root) CMD (  [ -x /usr/lib/php5/maxlifetime ] && [ -x /usr/lib/php5/sessionclean ] && [ -d /var/lib/php5 ] && /usr/lib/php5/sessionclean /var/lib/php5 $(/usr/lib/php5/maxlifetime))
Nov  1 09:17:01 BIGBERTHA CRON[9495]: (root) CMD (   cd / && run-parts --report /etc/cron.hourly)

Environment (on Server machine running cron jobs)

rascal@BIGBERTHA:~/work/skunkworks/scripts/python/scrapers$ which python
/usr/bin/python
rascal@BIGBERTHA:~/work/skunkworks/scripts/python/scrapers$ which bash
/bin/bash
rascal@BIGBERTHA:~$ which scrapy
/usr/local/bin/scrapy

funny_quotes.sh

#!/bin/bash

. /home/rascal/.profile

cd /home/rascal/work/skunkworks/scripts/python/scrapers

scrapy crawl funny_quotes

Running at the command prompt ...

rascal@BIGBERTHA:~$ /home/rascal/work/skunkworks/scripts/python/scrapers/funny_quotes.sh 
2017-11-01 09:57:05 [scrapy.utils.log] INFO: Scrapy 1.4.0 started (bot: scrapers)
2017-11-01 09:57:05 [scrapy.utils.log] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'scrapers.spiders', 'SPIDER_MODULES': ['scrapers.spiders'], 'BOT_NAME': 'scrapers'}
2017-11-01 09:57:05 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.logstats.LogStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.corestats.CoreStats']
...

Why are jobs being run differently through cron?

Upvotes: 0

Views: 214

Answers (3)

Homunculus Reticulli

Reputation: 68466

It turns out that this was related to an environment issue. Cron reported to syslog that it had run the job, yet no files/log files were created.

Eventually, the way I solved this was to add a MAILTO=username line to my crontab. After I had done this, when the job ran and failed, I saw the error message:

scrapy: command not found
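(For reference, the MAILTO line goes at the very top of the crontab, above the job entries; username is a placeholder for your local user or an email address. A minimal sketch, using one of the entries from the question:)

MAILTO=rascal
0,30 7-18 * * 1-5 /home/rascal/work/skunkworks/scripts/python/scrapers/funny_quotes.sh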

I then resolved this by adding the following statement to my bash script:

PATH=$PATH:/usr/local/bin/
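For completeness, a minimal sketch of the amended funny_quotes.sh with that line added (this is the same script shown in the question; only the PATH line and comment are new):

#!/bin/bash

# cron starts with a minimal PATH, so add /usr/local/bin, where scrapy lives
PATH=$PATH:/usr/local/bin/

. /home/rascal/.profile

cd /home/rascal/work/skunkworks/scripts/python/scrapers

scrapy crawl funny_quotes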

Upvotes: 1

Joao Vitorino

Reputation: 3256

You can create a custom log for your cron job.

0,30 8-18 * * 1-5 python /home/.../make_whoopy.py > /home/make_whoopy.py.log 2>&1

This will send stderr and stdout to make_whoopy.py.log

If you want to see the output on screen (a dirty solution):

0,30 8-18 * * 1-5 python /home/.../make_whoopy.py > /dev/pts/1

Of course this only works while you are logged in, and you have to change /dev/pts/1 according to your tty.
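If you are not sure which device your current terminal is, the tty command prints it; for example, it might show:

rascal@BIGBERTHA:~$ tty
/dev/pts/1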

Another alternative is to send the output by mail:

0,30 8-18 * * 1-5 python /home/.../make_whoopy.py 2>&1 | mail -s "cron output" [email protected]

And the cherry on the cake is to send the output to syslog:

0,30 8-18 * * 1-5 python /home/.../make_whoopy.py | /usr/bin/logger -t make_whoopy

The -t make_whoopy option tags the entries as make_whoopy in syslog.
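You can then read those entries back by filtering syslog for the tag (assuming the default syslog location, as in the question):

grep make_whoopy /var/log/syslog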

Upvotes: 0

Umair Ayub

Reputation: 21271

1) As far as I know, shouldn't you run bash funny_quotes.sh instead of just funny_quotes.sh?

2) Change your cron commands from

python /home/rascal/work/skunkworks/scripts/python/downloaders/make_whoopy.py

to

cd /home/rascal/work/skunkworks/scripts/python/downloaders/ && python make_whoopy.py

and likewise change this one from

/home/rascal/work/skunkworks/scripts/python/scrapers/funny_quotes.sh

to

cd /home/rascal/work/skunkworks/scripts/python/scrapers/ && bash funny_quotes.sh

3) Check your cron logs

grep CRON /var/log/syslog

I am sure you are getting an error like scrapy: command not found or something similar.

To fix it, do this

Copy the output of echo $PATH from your shell.

Then open crontab -e.

At the very top of the file, write PATH=YOUR_COPIED_CONTENTS

And that should work.
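For illustration, assuming echo $PATH printed a typical Debian/Ubuntu value, the top of the crontab would then look something like this (substitute your own copied output):

PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
0,30 8-18 * * 1-5 cd /home/rascal/work/skunkworks/scripts/python/downloaders/ && python make_whoopy.py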

Upvotes: 0
