Reputation: 607
Hello, I need to run Django Celery in production with SQS, but it doesn't work. These are the configurations in my settings:
BROKER_URL = 'sqs://' + AWS_ACCESS_KEY_ID + ':' + AWS_SECRET_ACCESS_KEY + '@'
BROKER_TRANSPORT = 'sqs'
BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-east-1',
    'visibility_timeout': 3600,
    # 'polling_interval': 0.3,
    # 'queue_name_prefix': 'celery-',
}
BROKER_USER = AWS_ACCESS_KEY_ID
BROKER_PASSWORD = AWS_SECRET_ACCESS_KEY
CELERY_DEFAULT_QUEUE = 'mall4g-sqs'
CELERY_QUEUES = {
    CELERY_DEFAULT_QUEUE: {
        'exchange': CELERY_DEFAULT_QUEUE,
        'binding_key': CELERY_DEFAULT_QUEUE,
    }
}
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'filters': {
        'require_debug_false': {
            '()': 'django.utils.log.RequireDebugFalse'
        }
    },
    'handlers': {
        'mail_admins': {
            'level': 'ERROR',
            'filters': ['require_debug_false'],
            'class': 'django.utils.log.AdminEmailHandler'
        }
    },
    'loggers': {
        'django.request': {
            'handlers': ['mail_admins'],
            'level': 'ERROR',
            'propagate': True,
        },
    }
}
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ENABLE_UTC = True
CELERYBEAT_SCHEDULE = {
    'orders-expired-task': {
        'task': 'orders.tasks.orders_expired',
        'schedule': timedelta(hours=2)
    },
    'remember-set-card': {
        'task': 'orders.tasks.remember_set_credit_card',
        'schedule': timedelta(days=14)
    },
    'example': {
        'task': 'orders.tasks.example',
        'schedule': timedelta(minutes=5)
    },
}
# needed for worker monitoring
CELERY_SEND_EVENTS = True
# where to store periodic tasks (needed for scheduler)
CELERYBEAT_SCHEDULER = "djcelery.schedulers.DatabaseScheduler"
CELERY_TIMEZONE = 'UTC'
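For reference, settings like these are normally loaded through the standard Celery app module for a Django project; a rough sketch, with proj standing in for the real project package:
# proj/celery.py (sketch; "proj" is a placeholder for the project package)
from __future__ import absolute_import

import os

from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)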
I don't know what else to add, or what else I need to do, to get the djcelery tasks running. Please help.
Thanks....
Upvotes: 1
Views: 3439
Reputation: 5067
If you are using Celery 4.0.0 and have this line:
app.config_from_object('django.conf:settings', namespace='CELERY')
then the namespace tells Celery that all Celery-related settings must start with the CELERY_ prefix. So use CELERY_BROKER_URL and CELERY_BROKER_TRANSPORT_OPTIONS instead of BROKER_URL and BROKER_TRANSPORT_OPTIONS.
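A rough sketch of what the renamed settings look like, reusing the broker values from the question (adjust region, timeout and queue name as needed):
# settings.py -- with namespace='CELERY', Celery only reads settings
# carrying that prefix, so the broker options must be renamed:
CELERY_BROKER_URL = 'sqs://'            # was BROKER_URL
CELERY_BROKER_TRANSPORT_OPTIONS = {     # was BROKER_TRANSPORT_OPTIONS
    'region': 'us-east-1',
    'visibility_timeout': 3600,
}
CELERY_TASK_DEFAULT_QUEUE = 'mall4g-sqs'  # was CELERY_DEFAULT_QUEUE
CELERY_TASK_SERIALIZER = 'json'           # already has the prefix, unchanged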
Upvotes: 2
Reputation: 3783
I've just managed to link up Celery and SQS.
In my settings:
BROKER_URL = 'sqs://'
BROKER_TRANSPORT_OPTIONS = {
    'region': 'eu-west-1',
    'visibility_timeout': 43200,  # in seconds
    'polling_interval': 3,
    'queue_name_prefix': 'repricer-stage-',
    'CELERY_SEND_TASK_ERROR_EMAILS': True
}
Above, note BROKER_URL = 'sqs://':
The login credentials can also be set using the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY; in that case the broker URL may be just 'sqs://'.
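For example, a small guard in settings.py (just a sketch) makes sure the worker's environment actually has the keys before relying on a bare sqs:// URL:
import os

# Fail fast if the AWS keys are missing from the worker's environment;
# otherwise the SQS transport will only fail later, at connection time.
for var in ('AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY'):
    if not os.environ.get(var):
        raise RuntimeError(var + ' must be set when using the sqs:// broker')

BROKER_URL = 'sqs://'  # credentials are taken from the environment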
Be aware that you need an active worker to interact with SQS. Start one from the console (in your virtual environment):
$ celery -A proj worker -l info
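To confirm that messages actually flow through SQS, a quick end-to-end check is to enqueue a throwaway task (the ping task below is just an example, placed in any app's tasks.py):
# yourapp/tasks.py -- throwaway task used only to test the broker
from celery import shared_task

@shared_task
def ping():
    return 'pong'
Then, from python manage.py shell, import ping and call ping.delay(); the worker started above should log the task being received from the SQS queue and returning 'pong'.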
Upvotes: 0