Reputation: 31
I've been working on getting Celery running within a Flask project. The Flask project uses the application factory pattern, which has caused app-context and circular-import issues when trying to get tasks to work.
I found this answer for a setup and now have Celery running: it can see and register my tasks, the tasks can be called and do show up in the message queue, and with a lot of work I can even get them to record a failure (so far only a revocation) in the Redis results backend.
The tasks themselves run without error when called directly, and I don't get a failure; I can even send a task through with bad data and get an error code back.
The Celery app is set up to use RabbitMQ as the broker and Redis as the result backend, and both appear to be working. I can log into RabbitMQ and see messages enter the queues and the workers connected to the queue, and I can see some results eventually make it to Redis.
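For reference, the broker and backend configuration is essentially the following (shown here as a plain settings mapping rather than my actual factory code; the password is a placeholder, and the URLs match what appears in the worker banner below):

```python
# Illustrative Celery settings; setting names are the Celery 4 lowercase style.
CELERY_CONFIG = {
    # RabbitMQ broker: vhost "janus" on the local host
    "broker_url": "amqp://janus:password@localhost:5672/janus",
    # Redis result backend, database 0
    "result_backend": "redis://localhost:6379/0",
}
```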
I am not sure what code would be helpful to include for this issue. I do have some logging output that shows the problem, but I have no idea how to troubleshoot beyond it. Excerpt from the worker debug log:
-------------- celery@sy-devsophia v4.3.0 (rhubarb)
---- **** -----
--- * *** * -- Linux-3.10.0-1062.el7.x86_64-x86_64-with-redhat-7.6-Maipo 2019-10-14 18:13:20
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: janus:0x7f0f2b715a58
- ** ---------- .> transport: amqp://janus:**@localhost:5672/janus
- ** ---------- .> results: redis://localhost:6379/0
- *** --- * --- .> concurrency: 1 (prefork)
-- ******* ---- .> task events: ON
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. celery.accumulate
. celery.backend_cleanup
. celery.chain
. celery.chord
. celery.chord_unlock
. celery.chunks
. celery.group
. celery.map
. celery.starmap
. janus.workers.imports.course_import
. janus.workers.reports.run_published_report
[2019-10-14 18:13:20,356: DEBUG/MainProcess] | Worker: Starting Hub
[2019-10-14 18:13:20,356: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:20,356: DEBUG/MainProcess] | Worker: Starting Pool
[2019-10-14 18:13:20,442: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:20,443: DEBUG/MainProcess] | Worker: Starting Consumer
[2019-10-14 18:13:20,444: DEBUG/MainProcess] | Consumer: Starting Connection
[2019-10-14 18:13:20,501: INFO/MainProcess] Connected to amqp://janus:**@localhost:5672/janus
[2019-10-14 18:13:20,501: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:20,501: DEBUG/MainProcess] | Consumer: Starting Events
[2019-10-14 18:13:20,547: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:20,547: DEBUG/MainProcess] | Consumer: Starting Mingle
[2019-10-14 18:13:20,547: INFO/MainProcess] mingle: searching for neighbors
[2019-10-14 18:13:21,608: INFO/MainProcess] mingle: all alone
[2019-10-14 18:13:21,608: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:21,608: DEBUG/MainProcess] | Consumer: Starting Tasks
[2019-10-14 18:13:21,615: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:21,615: DEBUG/MainProcess] | Consumer: Starting Control
[2019-10-14 18:13:21,619: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:21,619: DEBUG/MainProcess] | Consumer: Starting Gossip
[2019-10-14 18:13:21,624: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:21,624: DEBUG/MainProcess] | Consumer: Starting Heart
[2019-10-14 18:13:21,626: DEBUG/MainProcess] ^-- substep ok
[2019-10-14 18:13:21,626: DEBUG/MainProcess] | Consumer: Starting event loop
[2019-10-14 18:13:21,626: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2019-10-14 18:13:21,629: INFO/MainProcess] celery@sy-devsophia ready.
[2019-10-14 18:13:51,174: INFO/MainProcess] Received task: janus.workers.reports.run_published_report[fba8f1e0-be99-4800-a9df-0f564383647a]
[2019-10-14 18:13:51,175: DEBUG/MainProcess] TaskPool: Apply <function _fast_trace_task at 0x7f0f28197840> (args:('janus.workers.reports.run_published_report', 'fba8f1e0-be99-4800-a9df-0f564383647a', {'lang': 'py', 'task': 'janus.workers.reports.run_published_report', 'id': 'fba8f1e0-be99-4800-a9df-0f564383647a', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'retries': 0, 'timelimit': [None, None], 'root_id': 'fba8f1e0-be99-4800-a9df-0f564383647a', 'parent_id': None, 'argsrepr': "('6201',)", 'kwargsrepr': '{}', 'origin': 'gen13297@sy-devsophia', 'reply_to': '9cd089a7-a28c-35a8-ab34-10440a35f5e2', 'correlation_id': 'fba8f1e0-be99-4800-a9df-0f564383647a', 'delivery_info': {'exchange': '', 'routing_key': 'celery', 'priority': 0, 'redelivered': False}}, <memory at 0x7f0f2333f1c8>, 'application/json', 'utf-8') kwargs:{})
[2019-10-14 18:13:51,177: DEBUG/MainProcess] | Worker: Closing Hub...
[2019-10-14 18:13:51,177: DEBUG/MainProcess] | Worker: Closing Pool...
[2019-10-14 18:13:51,177: DEBUG/MainProcess] | Worker: Closing Consumer...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Worker: Stopping Consumer...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Consumer: Closing Connection...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Consumer: Closing Events...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Consumer: Closing Mingle...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Consumer: Closing Tasks...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Consumer: Closing Control...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Consumer: Closing Gossip...
[2019-10-14 18:13:51,178: DEBUG/MainProcess] | Consumer: Closing Heart...
[2019-10-14 18:13:51,179: DEBUG/MainProcess] | Consumer: Closing event loop...
[2019-10-14 18:13:51,179: DEBUG/MainProcess] | Consumer: Stopping event loop...
[2019-10-14 18:13:51,179: DEBUG/MainProcess] | Consumer: Stopping Heart...
[2019-10-14 18:13:51,179: DEBUG/MainProcess] | Consumer: Stopping Gossip...
[2019-10-14 18:13:51,186: DEBUG/MainProcess] | Consumer: Stopping Control...
[2019-10-14 18:13:51,188: DEBUG/MainProcess] | Consumer: Stopping Tasks...
[2019-10-14 18:13:51,188: DEBUG/MainProcess] Canceling task consumer...
[2019-10-14 18:13:51,188: DEBUG/MainProcess] | Consumer: Stopping Mingle...
[2019-10-14 18:13:51,189: DEBUG/MainProcess] | Consumer: Stopping Events...
[2019-10-14 18:13:51,189: DEBUG/MainProcess] | Consumer: Stopping Connection...
[2019-10-14 18:13:51,189: DEBUG/MainProcess] | Worker: Stopping Pool...
[2019-10-14 18:13:52,212: DEBUG/MainProcess] result handler: all workers terminated
[2019-10-14 18:13:52,212: DEBUG/MainProcess] | Worker: Stopping Hub...
[2019-10-14 18:13:52,212: DEBUG/MainProcess] | Consumer: Shutdown Heart...
Then, in the RabbitMQ logs:
=INFO REPORT==== 14-Oct-2019::18:21:05 ===
closing AMQP connection <0.15130.5> ([::1]:57688 -> [::1]:5672)
=WARNING REPORT==== 14-Oct-2019::18:21:05 ===
closing AMQP connection <0.15146.5> ([::1]:57690 -> [::1]:5672):
connection_closed_abruptly
Since I can run the tasks directly without errors, the messages appear in the queue and reach the workers, and a cancelled task does get reported in the Redis backend, I can't see where the error is other than this messaging disconnect. But I don't get anything beyond RabbitMQ reporting that the client closed the connection; there is no indication of why the worker just up and dies when it receives a task.
I assume I have an issue in my Celery setup somewhere, but the answer linked above is the closest I have managed to get Celery working with this application.
Any help pointing me to where to track down this issue would be appreciated. If there is any setup or code that would help to see, I can share it; I am just not sure what would be useful at this point without more error messages to go by.
Upvotes: 0
Views: 409
Reputation: 31
I seem to have figured out what was wrong; at least my Celery setup is now working.
I was using librabbitmq for the AMQP transport and changed to pyamqp. Once I changed the transport library, it started working.
Upvotes: -1