KVISH

Reputation: 13208

SQS with Celery configuration

I was trying to set up Amazon SQS for Celery, and I have the configuration below:

import os

BROKER_BACKEND = "SQS"
BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-east-1',
}
AWS_ACCESS_KEY_ID = "..."  # access key id
AWS_SECRET_ACCESS_KEY = "..."  # secret access key
os.environ.setdefault("AWS_ACCESS_KEY_ID", AWS_ACCESS_KEY_ID)
os.environ.setdefault("AWS_SECRET_ACCESS_KEY", AWS_SECRET_ACCESS_KEY)

BROKER_URL = 'sqs://'

CELERY_IMPORTS = ("tasks", )
CELERY_TASK_RESULT_EXPIRES = 300

CELERY_DEFAULT_QUEUE = "..."  # queue name
CELERY_DEFAULT_EXCHANGE = CELERY_DEFAULT_QUEUE
CELERY_DEFAULT_EXCHANGE_TYPE = CELERY_DEFAULT_QUEUE
CELERY_DEFAULT_ROUTING_KEY = CELERY_DEFAULT_QUEUE
CELERY_QUEUES = {
    CELERY_DEFAULT_QUEUE: {
        'exchange': CELERY_DEFAULT_QUEUE,
        'binding_key': CELERY_DEFAULT_QUEUE,
    }
}

In my SQS configuration on the AWS account, I have a queue with the name given in CELERY_DEFAULT_QUEUE. When I run this locally, everything works...but for some reason it creates another queue on SQS with the name format <user_id>-celery-pidbox, something like MyUser-MacBook-Pro-local-celery-pidbox.

Is this normal? Why would it create another queue when I already have one with the specified name? Otherwise it's working; I'm just not sure whether that other queue is required or I missed something. Any help is appreciated; I could not find this in the docs.

EDIT

Turns out this is normal. For some reason django-celery does this: it creates a queue for each box that accesses the queue you want to use. They will fix this in a future release. If somebody knows how to fix this temporarily, please let me know. Thanks!

Upvotes: 14

Views: 4074

Answers (3)

Mohd Shoaib

Reputation: 43

If you want to connect Celery with SQS, you can create the Celery app using the code below:

from celery import Celery


def make_celery(app):
    # Create a Celery app configured to use SQS as the broker. The
    # queue_name_prefix keeps this service's queues separate from others
    # in the same AWS account (substitute real values for the
    # SERVICE_ENV/SERVICE_NAME placeholders).
    celery = Celery(
        app.import_name,
        broker="sqs://",
        broker_transport_options={
            "queue_name_prefix": "{SERVICE_ENV}-{SERVICE_NAME}-"
        },
    )
    task_base = celery.Task

    class ContextTask(task_base):
        # Run every task inside the application context so tasks can use
        # the app's configuration and extensions.
        abstract = True

        def __call__(self, *args, **kwargs):
            with app.app_context():
                return task_base.__call__(self, *args, **kwargs)

    celery.Task = ContextTask

    return celery

With this, Celery will be able to connect to SQS.
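For example, here is a minimal usage sketch, assuming a Flask app (app.import_name comes from Flask); the flask_app and add names are illustrative, not part of the original answer:

from flask import Flask

flask_app = Flask(__name__)
celery = make_celery(flask_app)

@celery.task
def add(x, y):
    # Runs inside flask_app's application context via ContextTask.
    return x + y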

Upvotes: 1

randomproton

Reputation: 41

To disable that, you need to set these:

    CELERY_ENABLE_REMOTE_CONTROL = False
    CELERY_SEND_EVENTS = False
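For context, a minimal sketch of how these flags fit alongside the broker settings from the question (using the same uppercase setting names):

BROKER_URL = 'sqs://'
BROKER_TRANSPORT_OPTIONS = {'region': 'us-east-1'}

# Remote control (the broadcast/pidbox machinery) is what creates the
# <hostname>-celery-pidbox queue; disabling it stops that.
CELERY_ENABLE_REMOTE_CONTROL = False

# Also stop publishing task events (used by monitors such as Flower).
CELERY_SEND_EVENTS = False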

Upvotes: 4

Gustavo Rauber

Reputation: 110

This is actually good behavior: it lets you monitor which instances (IPs or local hostnames) are accessing your SQS account. It is just one request, so it won't cost you anything.

Upvotes: 5
