Reputation: 4434
I am using SQS as a 'bridge' in my system: it receives tasks from GAE, which are then processed on EC2. Currently I am able to add tasks to this queue from GAE, but I am having some difficulty working out how to consume these tasks on EC2. So my questions are:
Is there an existing implementation or library for EC2 which would keep an eye on SQS and assign new inbound jobs to workers? Is there any SQS monitoring product that does this? If not, is Celery's periodic task a good candidate?
Upvotes: 0
Views: 4461
Reputation: 574
You can use the AWS Elastic Beanstalk service to consume the tasks in the queue; see AWS Beanstalk with SQS.
If you don't want to break your code down to run within Beanstalk, you can write some code for Beanstalk to pull an item off the queue and then send it to your EC2 server, essentially making Beanstalk hand out the messages/tasks in the queue. This would remove the need for your EC2 server to constantly poll the queue.
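For illustration, here is a minimal sketch of that hand-off, assuming a Beanstalk worker-tier environment (whose built-in SQS daemon POSTs each queue message to your app over HTTP); the EC2 endpoint URL, route, and function names are hypothetical:

    # Minimal sketch (hypothetical names throughout): in a Beanstalk worker
    # environment, the SQS daemon pulls messages off the queue and POSTs each
    # one to this app. The app just forwards the task to the EC2 server, so
    # the EC2 side never has to poll SQS itself.
    import requests
    from flask import Flask, request

    app = Flask(__name__)

    # Hypothetical endpoint on the EC2 server that actually runs the task.
    EC2_TASK_URL = "http://my-ec2-host.example.com/run-task"

    @app.route("/", methods=["POST"])
    def forward_task():
        task_body = request.get_data()  # raw SQS message body
        resp = requests.post(EC2_TASK_URL, data=task_body, timeout=30)
        resp.raise_for_status()
        # A 200 response tells the daemon to delete the message from the
        # queue; an error response leaves it visible for a later retry.
        return "", 200

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)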
Upvotes: 0
Reputation: 349
The task consumer is just a (long) poller. The boto Python library has SQS support. AWS provides the SQS service, but they don't make the consumers. I'm not familiar with Celery.
Standard practice is to poll for a message (which marks it as 'invisible'), then perform the action at the consumer end. When the action completes, delete the message. If the action fails because one of your compute nodes disappeared, the message will become visible again after a period of time and will be picked up by a future poll. If you need something smarter, you may want to implement an external ESB or experiment with AWS SWF.
http://boto.readthedocs.org/en/latest/ref/sqs.html
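As a rough sketch of that poll/process/delete loop with boto, where the queue name, region, and process_task() are placeholders for your own setup:

    # Minimal long-polling SQS consumer sketch using boto.
    import boto.sqs

    def process_task(body):
        print("processing:", body)  # replace with real work

    conn = boto.sqs.connect_to_region("us-east-1")
    queue = conn.get_queue("my-task-queue")

    while True:
        # Long poll: wait up to 20s for a message; receiving one makes it
        # invisible to other consumers for the queue's visibility timeout.
        messages = queue.get_messages(num_messages=1, wait_time_seconds=20)
        for msg in messages:
            process_task(msg.get_body())
            # Only delete after the work succeeds; if this consumer dies
            # first, the message becomes visible again and is re-polled.
            queue.delete_message(msg)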
Upvotes: 1