tuna

Reputation: 6351

Python Celery socket.error: [Errno 61] Connection refused

I am using Celery 3.0 with the following configuration file.

celeryconfig.py

BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'

CELERY_IMPORTS = ("tasks", )
CELERY_TASK_RESULT_EXPIRES = 300

tasks.py

import celery

@celery.task
def function(x,y):
    return x + y

and function.py

from tasks import function

print function.delay(4,4).get()

I run the worker with the following command:

celeryd --loglevel=INFO --config=celeryconfig

Everything works fine up to this point: Redis and Celery are running and I get results back.

But when I call the task from another file called parallelizer,

I get the following socket error:

 socket.error: [Errno 61] Connection refused

That file looks like this:

from examples.dummy.tasks import function
print function.delay(4,4).get()

Any ideas?

Upvotes: 6

Views: 8821

Answers (4)

Sugandh Narola

Reputation: 11

I was facing the same issue and tried almost every solution I could find. In the end, this is the one that worked for me.

All you need to do is add these two lines of code to your project's __init__.py file:

from __future__ import absolute_import

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app


Upvotes: 1

Jinesh

Reputation: 2585

I had the same problem and ended up realizing that the rabbitmq and redis processes were stopped.

On macOS, if those services were installed via Homebrew, you can check whether they are running with the following command in a terminal:

brew services list

You can restart the services (if installed via brew) with:

brew services restart rabbitmq
brew services restart redis
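
If you want to rule out the broker itself before digging into Celery, a quick sanity check from Python (this assumes the redis-py package, imported as redis, is installed) looks something like this:

import redis

# Try to reach Redis on the host/port/db used in the question's BROKER_URL.
# ping() returns True if the server is up, or raises ConnectionError
# (which surfaces as "Connection refused" when nothing is listening).
client = redis.StrictRedis(host='localhost', port=6379, db=0)
print client.ping()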

Upvotes: 7

Eric Marcos

Reputation: 2707

I had the same problem and the issue was that I missed this code in my project's __init__.py:

from __future__ import absolute_import

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
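
For context, `from .celery import app` assumes there is a celery.py module next to that __init__.py. A minimal sketch of what that module usually contains in a Django project (with "proj" standing in for your own project package) is:

# proj/celery.py -- sketch of the module the import above refers to.
from __future__ import absolute_import

import os

from celery import Celery
from django.conf import settings

# Tell Celery which Django settings module to use.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')

# Read broker/backend configuration (e.g. BROKER_URL) from Django settings.
app.config_from_object('django.conf:settings')

# Discover task modules in all installed Django apps.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)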

I hope it will be useful to someone out there...

Upvotes: 9

tuna

Reputation: 6351

The problem was that my celeryconfig.py was in a different path than my parallelizer.

Moving celeryconfig.py into the same directory as the parallelizer fixed the issue.
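
If you would rather not depend on which directory you launch from, one alternative (just a sketch; the path below is a placeholder for wherever your celeryconfig.py lives) is to put that directory on sys.path and load the config explicitly onto a Celery app instance:

import sys

from celery import Celery

# Make the directory containing celeryconfig.py importable, regardless of cwd.
sys.path.insert(0, '/path/to/dir/containing/celeryconfig')  # placeholder path

app = Celery()
app.config_from_object('celeryconfig')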

Upvotes: 1
