101010

Reputation: 15716

How to get Celery to load the config from the command line?

I am attempting to use celery worker to load a config file at the command line:

celery worker --config=my_settings_module

This doesn't appear to work: celery worker starts up and uses its default settings (which include assuming that there is a RabbitMQ server available at localhost:5672). In my config, I would like to point Celery at a different broker, but when I change the amqp settings in the config file, Celery doesn't appear to care; it still shows the default RabbitMQ settings.
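For illustration, what I have in that config module is essentially just broker settings, roughly like this (the host and credentials are placeholders, not my real values, and I'm using the old-style uppercase setting names):

# my_settings_module.py (sketch, not my exact file)
BROKER_URL = 'amqp://user:password@some-other-host:5672/myvhost'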

I also tried something bogus:

celery worker --config=this_file_does_not_exist

And Celery once again did not care. The worker started and attached to the default RabbitMQ. It doesn't even seem to be looking at the --config setting.

I read about how Celery lazy loads. I'm not sure that has anything to do with this.

How do I get celery worker to honor the --config setting?

Upvotes: 4

Views: 7199

Answers (2)

p2k

Reputation: 51

I had the exact same problem, but eventually figured out that I had made simple option-naming mistakes in the configuration module itself, which were not obvious at all.

You see, when you start out and follow the tutorial, you will end up with something that looks like this in your main module:

app = Celery('foo', broker='amqp://user:password@example.com/vsrv', backend='rpc://')

This works fine, but later, as you add more and more configuration options, you decide to move them to a separate file, at which point you just copy, paste, and split the options into lines until it looks like this:

Naïve my_settings.py:

broker='amqp://user:password@example.com/vsrv'
backend='rpc://'
result_persistent=True
task_acks_late=True
# ... etc. etc.

And there you just fooled yourself! Because in a settings module the options are called broker_url and result_backend instead of just broker and backend as they would be called in the instantiation above.

Corrected my_settings.py:

broker_url='amqp://user:password@example.com/vsrv'
result_backend='rpc://'
result_persistent=True
task_acks_late=True
# ... etc. etc.

And all of a sudden, your worker boots up just fine with all settings in place.
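For completeness, here is a minimal sketch of how such a settings module gets attached to the app; the module and app names are just the ones from the example above. It can be loaded either with --config on the command line, as in the question, or explicitly in code via config_from_object:

# tasks.py (sketch)
from celery import Celery

app = Celery('foo')
# Either start the worker with `celery worker --config=my_settings`,
# or load the settings module explicitly here:
app.config_from_object('my_settings')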

I hope this will cure a few headaches of fellow celery newbies like us.

Further note:
You can test that celery in fact does not ignore your file by placing a print statement (or a print() call if you're on Python 3) into the settings module.
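For example, something like this at the top of the corrected my_settings.py from above (the print runs once when the worker imports the module):

# my_settings.py (sketch)
print('my_settings.py was imported')  # shows up in the worker's console output

broker_url = 'amqp://user:password@example.com/vsrv'
result_backend = 'rpc://'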

Upvotes: 2

Chillar Anand

Reputation: 29514

If you give an invalid module name, or a module name that is not on your PYTHONPATH, say celery worker --config=invalid_foo, celery will silently ignore it.

You can verify that a valid config module does get picked up by creating a simple one.

$ celery worker -h

--config=CONFIG       Name of the configuration module

As mentioned in the celery worker help, --config expects the name of a configuration module, not a file name; passing a file name raises an error, as shown at the end.

If you just run

celery worker

it will start the worker and its log output will be colored.

In the same directory, create a file called c.py with this single line:

CELERYD_LOG_COLOR = False

Now run

celery worker --config=c

it will start the worker and its log output will not be colored, which shows that the config module was loaded.

If you run celery worker --config=c.py instead, it will raise an error:

celery.utils.imports.NotAPackage: Error: Module 'c.py' doesn't exist, or it's not a valid Python module name.
Did you mean 'c'?
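Once you have confirmed that the module is being read, the broker override the question asks about goes into the same module. A sketch using the old-style uppercase setting names (matching CELERYD_LOG_COLOR above); the URL is a placeholder:

# c.py (sketch)
CELERYD_LOG_COLOR = False
# Point the worker at a non-default RabbitMQ instance:
BROKER_URL = 'amqp://user:password@some-other-host:5672/myvhost'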

Upvotes: 4
