Lukas Mayer

Reputation: 1317

EventSource / Server-Sent Events through Nginx

On the server side I'm using Sinatra with a stream block:

require 'sinatra'

connections = []  # open streams, shared across requests

get '/stream', :provides => 'text/event-stream' do
  stream :keep_open do |out|
    connections << out                        # register this client
    out.callback { connections.delete(out) }  # clean up when it disconnects
  end
end

On the client side:

var es = new EventSource('/stream');
es.onmessage = function(e) { $('#chat').append(e.data + "\n"); };  // append each message (uses jQuery)

When I use the app directly, via http://localhost:9292/, everything works perfectly. The connection is persistent and all messages are passed to all clients.

However, when it goes through Nginx, at http://chat.dev, the connection is dropped and a reconnection fires every second or so.

The Nginx setup looks OK to me:

upstream chat_dev_upstream {
  server 127.0.0.1:9292;
}

server {
  listen       80;
  server_name  chat.dev;

  location / {
    proxy_pass http://chat_dev_upstream;
    proxy_buffering off;
    proxy_cache off;
    proxy_set_header Host $host;
  }
}

I tried keepalive 1024 in the upstream section, as well as proxy_set_header Connection keep-alive; in the location block.

Nothing helps :(

There are no persistent connections, and messages are not passed to any clients.

Upvotes: 130

Views: 79976

Answers (5)

Chris

Reputation: 18892

On Linux:

sudo nano /etc/nginx/sites-available/default

Then, in all applicable locations (I had the same location in the default file twice, so I added it to both), add proxy_buffering off; like this:

location /api {
    proxy_pass http://localhost:3332;
    include proxy_params;
    proxy_buffering off;
}

Also, you may want your MIME type to be an event stream. In Flask, return it like so:

return Response(stream_with_context(generateTestStream()), mimetype="text/event-stream")
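
For context, a minimal sketch of a complete Flask endpoint along those lines (generateTestStream here is just a placeholder generator, not something defined in this answer):

from flask import Flask, Response, stream_with_context
import time

app = Flask(__name__)

def generateTestStream():
    # Placeholder generator yielding SSE-formatted frames every second
    while True:
        yield "data: ping\n\n"   # each SSE frame ends with a blank line
        time.sleep(1)

@app.route('/api/stream')
def stream():
    return Response(stream_with_context(generateTestStream()),
                    mimetype="text/event-stream")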

Upvotes: 0

MarcFasel

Reputation: 1158

Elevating this comment from Did to an answer: this was the only thing I needed to add when streaming from Django using StreamingHttpResponse through Nginx. None of the other switches above helped, but this header did.

Having the server respond with a "X-Accel-Buffering: no" header helps a lot! (see: wiki.nginx.org/X-accel#X-Accel-Buffering) – Did Jul 1, 2013 at 16:24
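
A minimal sketch of what that looks like in a Django view (the event_stream generator is hypothetical; the header is the point):

from django.http import StreamingHttpResponse

def sse_view(request):
    def event_stream():
        yield "data: hello\n\n"  # hypothetical generator producing SSE frames

    response = StreamingHttpResponse(event_stream(), content_type="text/event-stream")
    response["X-Accel-Buffering"] = "no"  # tell Nginx not to buffer this response
    return response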

Upvotes: 6

E1.

Reputation: 515

Another option is to include an 'X-Accel-Buffering' header with the value 'no' in your response. Nginx treats it specially; see http://nginx.org/en/docs/http/ngx_http_proxy_module.html#proxy_buffering

Upvotes: 47

Martin Konecny

Reputation: 59681

Don't write this from scratch yourself. Nginx is a wonderful evented server and has modules that will handle SSE for you without degrading the performance of your upstream server.

Check out https://github.com/wandenberg/nginx-push-stream-module

The way it works: the subscriber (the browser using SSE) connects to Nginx, and the connection stops there. The publisher (your server behind Nginx) sends a POST to Nginx at a corresponding route, and at that moment Nginx immediately forwards it to the waiting EventSource listeners in the browser.

This method is much more scalable than having your Ruby web server handle these "long-polling" SSE connections.
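
Roughly, the setup looks like this (directive names are from the module's README; the paths and sizes are arbitrary examples):

http {
    push_stream_shared_memory_size 32m;

    server {
        listen 80;
        server_name chat.dev;

        # Publisher: your app POSTs messages here
        location /pub {
            push_stream_publisher admin;
            push_stream_channels_path $arg_id;
        }

        # Subscriber: browsers open an EventSource against this path
        location ~ /sub/(.*) {
            push_stream_subscriber eventsource;
            push_stream_channels_path $1;
        }
    }
}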

Upvotes: 15

user904990

Reputation:

Your Nginx config is correct; you're just missing a few lines.

Here is a "magic trio" making EventSource working through Nginx:

proxy_set_header Connection '';
proxy_http_version 1.1;
chunked_transfer_encoding off;

Place them in the location section and it should work.

You may also need to add

proxy_buffering off;
proxy_cache off;

That's not an official way of doing it.

I ended up with it through trial and error plus googling :)
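
For reference, here is the location block from the question with all of the above applied:

location / {
    proxy_pass http://chat_dev_upstream;
    proxy_set_header Host $host;

    # The "magic trio" for EventSource / SSE
    proxy_set_header Connection '';
    proxy_http_version 1.1;
    chunked_transfer_encoding off;

    # Likely needed as well
    proxy_buffering off;
    proxy_cache off;
}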

Upvotes: 300
