shchoi

Reputation: 23

Celery with Redis is not working well

I'm developing Celery tasks that aggregate social content from Facebook and Twitter.

The tasks are as follows:

The 'facebook_service_handler' and 'facebook_contents_handler' tasks call the Facebook Open API using the urlopen function.

Everything works fine when there are only a few urlopen requests (about 4~5 or fewer), but once the number of requests goes above that, the worker stops working.

Also, when Celery stalls, if I kill redis and celeryd and then restart them both, the last tasks are executed.

Can anybody help me with this problem?

I'm working on Mac OS X Lion.

Upvotes: 0

Views: 1097

Answers (1)

hymloth

Reputation: 7035

Ideally, you should have two different queues: one for network I/O (using the eventlet pool, which lets you run many more concurrent tasks) and one for the other tasks (using the default multiprocessing pool). If that feels complicated, take a look at CELERYD_TASK_SOFT_TIME_LIMIT. I had similar problems when using urllib's urlopen within Celery tasks, as a connection might hang and stall the whole system.
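A minimal sketch of that setup, assuming the task names from the question, a local Redis broker, and the old-style (pre-4.0) uppercase setting names; the module path and queue names are illustrative, not from the original post:

```python
# celeryconfig.py -- illustrative settings, not from the original post.
BROKER_URL = "redis://localhost:6379/0"

# Route the network-bound Facebook tasks to their own "io" queue, so a
# worker started with `celeryd -Q io -P eventlet -c 100` handles them
# with the eventlet pool, while a second worker started with
# `celeryd -Q default` keeps the default multiprocessing pool.
CELERY_ROUTES = {
    "tasks.facebook_service_handler": {"queue": "io"},
    "tasks.facebook_contents_handler": {"queue": "io"},
}
CELERY_DEFAULT_QUEUE = "default"

# Abort any task that runs longer than 30 seconds, so a hung urlopen
# cannot block a worker slot forever (Celery raises
# SoftTimeLimitExceeded inside the task, which it can catch to clean up).
CELERYD_TASK_SOFT_TIME_LIMIT = 30
```

With this routing, a stuck API call can at worst tie up one eventlet greenlet on the io queue; the other queue's workers keep processing.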
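Independently of the queue split, the hang can also be cut off at the source: urlopen accepts a timeout argument (in seconds), so a dead connection raises an error instead of blocking the worker forever. A sketch, where the fetch helper and its defaults are illustrative (on Python 2 the same timeout argument exists on urllib2.urlopen):

```python
import socket
from urllib.request import urlopen  # urllib2.urlopen on Python 2

def fetch(url, timeout=10):
    """Fetch a URL, returning None instead of hanging on a dead peer."""
    try:
        response = urlopen(url, timeout=timeout)
        try:
            return response.read()
        finally:
            response.close()
    except (OSError, socket.timeout):  # URLError is an OSError subclass
        return None
```

Calling fetch from inside the task keeps a single slow endpoint from consuming the whole pool.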

Upvotes: 2
