Basti G.

Reputation: 441

Celery worker crashes when I start another one

I use Django with Celery and Redis. I would like to have three queues and three workers.

My Celery settings in settings.py look like this:

from kombu import Exchange, Queue

CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Europe/Berlin'

# CELERY QUEUES SETUP
CELERY_DEFAULT_QUEUE = 'default'
CELERY_DEFAULT_ROUTING_KEY = 'default'
CELERY_TASK_QUEUES = (
    Queue('default', Exchange('default'), routing_key='default'),
    Queue('manually_crawl', Exchange('manually_crawl'), routing_key='manually_crawl'),
    Queue('periodically_crawl', Exchange('periodically_crawl'), routing_key='periodically_crawl'),
)
CELERY_ROUTES = {
    'api.tasks.crawl_manually': {'queue': 'manually_crawl', 'routing_key': 'manually_crawl',},
    'api.tasks.crawl_periodically': {'queue': 'periodically_crawl', 'routing_key': 'periodically_crawl',},
    'api.tasks.crawl_firsttime': {'queue': 'default', 'routing_key': 'default',},
}
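
For context, these CELERY_*-prefixed settings are normally picked up by a small Celery app module inside the project package. The question doesn't show that file, so the following is only a minimal sketch of what proj/celery.py typically looks like in a standard Django layout (names are assumptions based on the -A proj argument used below):

# proj/celery.py -- hypothetical sketch of the usual Django/Celery app module
import os

from celery import Celery

# Make sure Django settings are loaded before the app is configured.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
# Read every CELERY_*-prefixed setting from Django's settings.py.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Auto-discover task modules such as api/tasks.py.
app.autodiscover_tasks()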

Later I will start the workers with celery multi, but during development I would like to start the workers manually to see errors and so on.

I start the Redis server with redis-server, and then I start the first worker, default, with:

celery -A proj worker -Q default -l debug -n default_worker

If I try to start the next worker in a new terminal with:

celery -A proj worker -Q manually_crawl -l debug -n manually_crawl

I get an error in the terminal of the first (default) worker:

[2019-10-28 09:32:58,284: INFO/MainProcess] sync with celery@manually_crawl
[2019-10-28 09:32:58,290: ERROR/MainProcess] Control command error: OperationalError("\nCannot route message for exchange 'reply.celery.pidbox': Table empty or key no longer exists.\nProbably the key ('_kombu.binding.reply.celery.pidbox') has been removed from the Redis database.\n")
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/kombu/connection.py", line 439, in _reraise_as_library_errors
    yield
  File "/usr/local/lib/python3.7/dist-packages/kombu/connection.py", line 518, in _ensured
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.7/dist-packages/kombu/messaging.py", line 203, in _publish
    mandatory=mandatory, immediate=immediate,
  File "/usr/local/lib/python3.7/dist-packages/kombu/transport/virtual/base.py", line 605, in basic_publish
    message, exchange, routing_key, **kwargs
  File "/usr/local/lib/python3.7/dist-packages/kombu/transport/virtual/exchange.py", line 70, in deliver
    for queue in _lookup(exchange, routing_key):
  File "/usr/local/lib/python3.7/dist-packages/kombu/transport/redis.py", line 877, in _lookup
    exchange, redis_key))
kombu.exceptions.InconsistencyError:
Cannot route message for exchange 'reply.celery.pidbox': Table empty or key no longer exists.
Probably the key ('_kombu.binding.reply.celery.pidbox') has been removed from the Redis database.


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/celery/worker/pidbox.py", line 46, in on_message
    self.node.handle_message(body, message)
  File "/usr/local/lib/python3.7/dist-packages/kombu/pidbox.py", line 145, in handle_message
    return self.dispatch(**body)
  File "/usr/local/lib/python3.7/dist-packages/kombu/pidbox.py", line 115, in dispatch
    ticket=ticket)
  File "/usr/local/lib/python3.7/dist-packages/kombu/pidbox.py", line 151, in reply
    serializer=self.mailbox.serializer)
  File "/usr/local/lib/python3.7/dist-packages/kombu/pidbox.py", line 285, in _publish_reply
    **opts
  File "/usr/local/lib/python3.7/dist-packages/kombu/messaging.py", line 181, in publish
    exchange_name, declare,
  File "/usr/local/lib/python3.7/dist-packages/kombu/connection.py", line 551, in _ensured
    errback and errback(exc, 0)
  File "/usr/lib/python3.7/contextlib.py", line 130, in __exit__
    self.gen.throw(type, value, traceback)
  File "/usr/local/lib/python3.7/dist-packages/kombu/connection.py", line 444, in _reraise_as_library_errors
    sys.exc_info()[2])
  File "/usr/local/lib/python3.7/dist-packages/vine/five.py", line 194, in reraise
    raise value.with_traceback(tb)
  File "/usr/local/lib/python3.7/dist-packages/kombu/connection.py", line 439, in _reraise_as_library_errors
    yield
  File "/usr/local/lib/python3.7/dist-packages/kombu/connection.py", line 518, in _ensured
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.7/dist-packages/kombu/messaging.py", line 203, in _publish
    mandatory=mandatory, immediate=immediate,
  File "/usr/local/lib/python3.7/dist-packages/kombu/transport/virtual/base.py", line 605, in basic_publish
    message, exchange, routing_key, **kwargs
  File "/usr/local/lib/python3.7/dist-packages/kombu/transport/virtual/exchange.py", line 70, in deliver
    for queue in _lookup(exchange, routing_key):
  File "/usr/local/lib/python3.7/dist-packages/kombu/transport/redis.py", line 877, in _lookup
    exchange, redis_key))
kombu.exceptions.OperationalError:
Cannot route message for exchange 'reply.celery.pidbox': Table empty or key no longer exists.
Probably the key ('_kombu.binding.reply.celery.pidbox') has been removed from the Redis database.

Why?

Upvotes: 0

Views: 2101

Answers (2)

Amir_Aryan

Reputation: 1

You can start multiple workers as shown below:

$ celery -A proj worker -l info --concurrency=4 -n wkr1@hostname
$ celery -A proj worker -l info --concurrency=2 -n wkr2@hostname
$ celery -A proj worker -l info --concurrency=2 -n wkr3@hostname

In the above example there are three workers, able to spawn 4, 2, and 2 child processes respectively. It is normally advised to run a single worker per machine, and the concurrency value defines how many processes run in parallel.

By default, the number of those processes equals the number of CPU cores on the machine.
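
As an aside, if the goal is only to consume the three queues from the question during development, a single worker can also subscribe to a comma-separated list of queues. The queue names below are taken from the question; the node name is just an example:

$ celery -A proj worker -Q default,manually_crawl,periodically_crawl -l info -n single_worker@%h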

I hope this could help you.

Upvotes: 0

Pascal de Sélys

Reputation: 137

There is currently a problem with the kombu library. According to this post, downgrading to 4.6.4 (or, for some people, 4.6.3) solves the problem:

jorijinnall commented 11 days ago
Had the same issue.
I fixed it by downgrading kombu from 4.6.5 to 4.6.3.
I still had the bug in version 4.6.4

Link: GitHub
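
If you want to try that workaround, pinning kombu to one of the versions mentioned above would look something like this (version number taken from the comment; adjust as needed):

$ pip install kombu==4.6.3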

Upvotes: 1
