user1592380

Reputation: 36317

Managing a long running request to Django from Heroku

I am working on a distributed system where I have one app mounted on Heroku making requests to a second, Django-based "API" app mounted on an EC2 Ubuntu instance. Yesterday I ran into a lot of confusion when I started getting a lot of 503 errors.

I checked the EC2 nginx log where I saw:

SIGPIPE: writing to a closed pipe/socket/fd (probably the client disconnected) on request /i/ 

Doing a:

$ heroku logs --tail 

gives:

2015-03-15T01:31:24.778005+00:00 heroku[router]: at=error code=H12 desc="Request timeout" method=POST path="/i/" host=myapp.herokuapp.com request_id=459d8925-194f-4d85-a6ed-a1dc90fb01fb fwd="216.**.**.**" dyno=web.1 connect=1ms service=30004ms status=503 bytes=0

Apparently these H12 errors occur for any Heroku request that takes more than 30 seconds to complete (https://devcenter.heroku.com/articles/limits#router). Unfortunately, my EC2 Django app takes 60 seconds to return a response.

From multiple sources, including How to increase Heroku 30s h12 timeout, it appears that I need to modify my code so that the Heroku request triggers a background process on the EC2 Django app and then returns an immediate response. I have not worked with asynchronous tasks in Python/Django before. What is the simplest/best approach to dealing with this?
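
Roughly, I'm picturing the view on the EC2 side ending up something like this (just a sketch; the hand-off to a background worker is exactly the part I don't know how to do, and start_background_job is a made-up name):

# views.py on the EC2 Django app -- what I think the /i/ endpoint needs to do
from django.http import JsonResponse

def i_view(request):
    # somehow hand the ~60-second job off to a background worker here,
    # instead of doing it inline...
    # start_background_job(request.POST)  # hypothetical: this is the missing piece
    # ...and answer right away so Heroku's 30-second router limit isn't hit
    return JsonResponse({"status": "accepted"}, status=202)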

Upvotes: 0

Views: 1995

Answers (1)

arosenber

Reputation: 126

Ha, you've asked a pretty open-ended question... Rather than rewrite a bunch of things that have already been written, let me try to point you in the right direction.

The standard choice for this with Django/Python is Celery. See https://devcenter.heroku.com/articles/celery-heroku#deploying-on-heroku for the Heroku-specific details. Once you get through the initial setup, it's as simple as decorating your async tasks with "@task" and calling them with ".delay()", as in the sketch below.
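
A minimal sketch of what that looks like (the names process_request and i_view, and the file layout, are placeholders; @shared_task is the current spelling of the task decorator):

# tasks.py -- the sleep stands in for your real ~60-second job
import time
from celery import shared_task

@shared_task
def process_request(payload):
    # the long-running work happens here, outside the HTTP request/response cycle
    time.sleep(60)
    return payload

# views.py -- the view enqueues the task and returns immediately
from django.http import JsonResponse
from .tasks import process_request

def i_view(request):
    result = process_request.delay(request.POST.dict())
    # 202 Accepted: the job is queued; the client can poll later using the task id
    return JsonResponse({"task_id": result.id}, status=202)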

You will need a "backing store" (a broker) that manages the tasks. The Heroku guide walks you through Redis. If you've got an underutilized database and your rate of async tasks is very small, you may want to use that in production. Otherwise, the cheapest solution I've found without performance worries has been Redis.
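
The Celery wiring itself is only a few lines; a minimal sketch, assuming a Django project named myproject and a Redis add-on that exposes a REDIS_URL environment variable:

# myproject/celery.py -- minimal Celery app pointed at Redis
import os
from celery import Celery

# "myproject" is a placeholder for your actual project name
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

app = Celery(
    "myproject",
    # REDIS_URL is the convention used by Heroku's Redis add-ons;
    # fall back to a local Redis for development
    broker=os.environ.get("REDIS_URL", "redis://localhost:6379/0"),
)
app.autodiscover_tasks()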

If your app is really light, consider using Honcho to run the web and worker processes together on a single dyno (see http://www.radekdostal.com/content/heroku-running-multiple-python-processes-single-dyno-using-honcho).
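
The usual pattern there is a Procfile whose single web entry hands off to Honcho, which then starts its own process list; a sketch (the file names, gunicorn entry point, and Celery app name are just an example layout):

Procfile (read by Heroku):
web: honcho start -f Procfile.honcho

Procfile.honcho (read by Honcho):
web: gunicorn myproject.wsgi --bind 0.0.0.0:$PORT
worker: celery -A myproject worker --loglevel=info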

Upvotes: 1
