Reputation: 70125
On my Express/Node server, I set the content type to text/event-stream:
res.writeHead(200, {
  'Content-Type': 'text/event-stream'
});
Then, as a series of callbacks fire, I write data messages to the stream and follow it with two new lines:
res.write('data: ' + JSON.stringify(data) + '\n\n');
If I add logging on the server side, or if I just hit the URL with curl, I can see that data messages are being written out over a few seconds.
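For example, hitting the endpoint like this (assuming the app is listening locally on port 3000; the -N flag disables curl's own output buffering) shows the messages arriving one by one:
curl -N 'http://localhost:3000/library/search?q=medicine&async'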
However, when I try to use these data messages in a web page, nothing happens. (I'm testing on Chrome, Firefox, and Safari, all on a Mac.) Here's what the web page looks like:
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8" />
  <title>Testing</title>
</head>
<body>
  <h1>Server-Sent Events Test</h1>
  <script>
    var source = new EventSource('/library/search?q=medicine&async');
    source.onmessage = function(e) {
      document.body.innerHTML += JSON.parse(e.data).name + '<br>';
    };
  </script>
</body>
</html>
If I add a final callback on the server side that closes the connection (using res.end()), then the browsers respond to all of the data messages at once, but only after res.end() has happened, which would seem to defeat the purpose of using Server-Sent Events.
What do I need to change (short of giving up and switching to XHR polling) to have the browsers respond to the Server-Sent Events as they arrive? That is, after all, exactly the purpose and use case Server-Sent Events are designed for.
(A test page demonstrating the problem was available, but now that the problem has been resolved, I've removed it.)
Upvotes: 6
Views: 4097
Reputation: 17525
It looks like you have some middleware that is doing compression, and in doing so it buffers the output until you complete the response. You can see this with curl:
First, a bare GET:
curl <url>
Next, add an Accept-Encoding header (similar to what your browser sends):
curl <url> -H 'Accept-Encoding: gzip,deflate,sdch' --compressed
Note that --compressed just tells curl to decompress the response for you.
You'll notice that you see the expected behavior with the first request, but not the second. That makes it clear the problem is related to compression. I suggest turning off compression for that route, or finding a smarter middleware that knows how to compress each frame as it is written.
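If you happen to be using the popular compression middleware for Express, here's a minimal sketch of both options (assuming that's the middleware in play; the route path is taken from your EventSource URL):
var express = require('express');
var compression = require('compression');
var app = express();

app.use(compression({
  filter: function(req, res) {
    // Skip compression for the event-stream route so each
    // write reaches the client as it happens.
    if (req.path === '/library/search') return false;
    // Fall back to the module's default filter for everything else.
    return compression.filter(req, res);
  }
}));

// Alternatively, leave compression on and flush each frame;
// the compression middleware adds a res.flush() method for this:
//   res.write('data: ' + JSON.stringify(data) + '\n\n');
//   res.flush();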
Upvotes: 8
Reputation: 106726
It works fine for me in Chrome. Here's my test code:
sse.js:
var app = require('express')();

app.get('/', function(req, res) {
  res.sendFile(__dirname + '/sse.htm');
});

app.get('/events', function(req, res) {
  var counter = 0;
  res.writeHead(200, { 'Content-Type': 'text/event-stream' });
  // Send one event immediately, then one per second.
  res.write('data: ' + JSON.stringify({ name: 'foo' + (counter++) }) + '\n\n');
  var timer = setInterval(function() {
    res.write('data: ' + JSON.stringify({ name: 'foo' + (counter++) }) + '\n\n');
  }, 1000);
  // Stop writing if the client disconnects.
  req.on('close', function() {
    clearInterval(timer);
  });
});

app.listen(8000);
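To try it out, run node sse.js and open http://localhost:8000 in a browser.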
sse.htm:
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8" />
</head>
<body>
  <h1>Server-Sent Events Test</h1>
  <script>
    var source = new EventSource('/events');
    source.onmessage = function(e) {
      document.body.innerHTML += JSON.parse(e.data).name + '<br />';
    };
    source.onerror = function(e) {
      source.close();
    };
  </script>
</body>
</html>
This produces the following output:
foo0
// one second later
foo1
// two seconds later
foo2
// three seconds later
foo3
// etc.
Upvotes: 2