Ofer Helman

Reputation: 786

Flask with Celery - workers exit with exitcode 1

I have a Flask app with Celery.
When I run the worker as follows:

celery -A app.celery worker

I get the following output

 -------------- celery@local-pc v3.1.22 (Cipater)
---- **** -----
--- * ***  * -- Windows-7-6.1.7601-SP1
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         app:0x483e668
- ** ---------- .> transport:   mongodb://localhost:27017/app
- ** ---------- .> results:     mongodb://localhost:27017/app
- *** --- * --- .> concurrency: 8 (prefork)
-- ******* ----
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery


[2016-03-08 15:52:05,587: WARNING/MainProcess] celery@local-pc ready.
[2016-03-08 15:52:08,855: ERROR/MainProcess] Process 'Worker-8' pid:9720 exited with 'exitcode 1'
[2016-03-08 15:52:08,855: ERROR/MainProcess] Process 'Worker-7' pid:11940 exited with 'exitcode 1'
[2016-03-08 15:52:08,856: ERROR/MainProcess] Process 'Worker-6' pid:13120 exited with 'exitcode 1'
...

This repeats endlessly, and CPU usage rises to 100%.

The relevant configuration is:

CELERY_BROKER_URL = 'mongodb://localhost:27017/app'
CELERY_RESULT_BACKEND = 'mongodb://localhost:27017/'
CELERY_MONGODB_BACKEND_SETTINGS = {
    'database': 'app',
    'taskmeta_collection': 'my_taskmeta_collection',
}
CELERY_IMPORTS = ('app.tasks', )
CELERYD_FORCE_EXEC = True
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'

My project structure is:

proj/
    config.py
    app/
        __init__.py
        tasks.py
        views.py

This is how I configured Celery in __init__.py:

app = Flask(__name__)
app.config.from_object('config')
db = SQLAlchemy(app)


def make_celery(app):
    celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task

    class ContextTask(TaskBase):
        abstract = True

        def __call__(self, *args, **kwargs):
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)
    celery.Task = ContextTask
    return celery

celery = make_celery(app)

This is what I have in tasks.py:

from app import celery


@celery.task()
def add_together(a, b):
    return a + b

Update

When I remove the following line from the config file, the workers no longer exit:

CELERY_IMPORTS = ('app.tasks', )

but I get the following error:

Traceback (most recent call last):
  File "d:\python34\lib\site-packages\celery\worker\consumer.py", line 456, in on_task_received
    strategies[name](message, body,
KeyError: 'app.tasks.add_together'

Upvotes: 3

Views: 1530

Answers (1)

John Moutafis

Reputation: 23134

The official Celery documentation, http://docs.celeryproject.org/en/3.1/getting-started/brokers/mongodb.html,

states that when using MongoDB as a broker, you are doing so at your own risk:

Using MongoDB

Experimental Status

The MongoDB transport is in need of improvements in many areas and there are several open bugs. Unfortunately we don’t have the resources or funds required to improve the situation, so we’re looking for contributors and partners willing to help.

To test whether this is what is causing your problem, try using Redis or RabbitMQ as the broker instead.
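For example, switching to Redis would only require changing the broker and result-backend URLs. This is a minimal sketch, assuming a Redis server is running locally on the default port; the MongoDB-specific backend settings would be dropped:

```python
# Hypothetical drop-in replacement for the MongoDB broker/backend settings,
# assuming a local Redis server on the default port 6379.
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'

# The MongoDB-specific CELERY_MONGODB_BACKEND_SETTINGS block is no longer
# needed; the remaining import and serializer settings stay unchanged.
CELERY_IMPORTS = ('app.tasks', )
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
```

If the workers stay up with Redis, that would confirm the MongoDB transport as the source of the crashes.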

Upvotes: 1
