Reputation: 884
I set up my REST server with express.js. Now I want to add SSE to this server. After I implemented this sse package, I get an error. I know that I would get this error if I tried to use res.send twice, but I am not doing that.
ERROR: Error: Can't set headers after they are sent.
at ServerResponse.OutgoingMessage.setHeader (http.js:690:11)
at ServerResponse.header (/home/root/node_modules/express/lib/response.js:718:10)
at ServerResponse.send (/home/root/node_modules/express/lib/response.js:163:12)
at app.get.str (/home/root/.node_app_slot/main.js:1330:25)
at Layer.handle [as handle_request] (/home/root/node_modules/express/lib/router/layer.js:95:5)
at next (/home/root/node_modules/express/lib/router/route.js:131:13)
at sse (/home/root/node_modules/server-sent-events/index.js:35:2)
at Layer.handle [as handle_request] (/home/root/node_modules/express/lib/router/layer.js:95:5)
at next (/home/root/node_modules/express/lib/router/route.js:131:13)
at Route.dispatch (/home/root/node_modules/express/lib/router/route.js:112:3)
Is it possible that I can't use the express methods anymore within the sse function? For example:
app.get('/events', sse, function(req, res) {
    res.send('...');
});
Furthermore, I found this solution and this. Is it possible to implement SSE with the res.write function, or in some other way, without using another package?
Upvotes: 23
Views: 56175
Reputation: 56875
This adds a complete, runnable example to John's excellent answer and makes a tweak, adding the Connection: keep-alive header. Also included is a client to read the stream and handle the possibility of multiple chunks arriving at once, which seems to be a characteristic of fetch.
JSON isn't strictly necessary but is useful to separate the data payload from the SSE metadata.
server.js:

const express = require("express");
const app = express();

app.use(express.static("public"));

// SSE endpoint: send the headers once, then keep the connection open and write events.
app.get("/stream", (req, res) => {
  res.writeHead(200, {
    "Connection": "keep-alive",
    "Cache-Control": "no-cache",
    "Content-Type": "text/event-stream",
  });

  // Push a counter to the client every 100 ms.
  let counter = 0;
  const interval = setInterval(() => {
    const chunk = JSON.stringify({chunk: counter++});
    res.write(`data: ${chunk}\n\n`);
  }, 100);

  // Stop the timer when the client disconnects.
  res.on("close", () => {
    clearInterval(interval);
    res.end();
  });
});

const listener = app.listen(process.env.PORT || 3000, () =>
  console.log(`Your app is listening on port ${listener.address().port}`)
);
public/index.html:

<!DOCTYPE html>
<html lang="en">
<head><title>SSE POC</title></head>
<body>
<script>
(async () => {
  const response = await fetch("/stream", {
    headers: {
      "Accept": "text/event-stream",
    },
  });

  if (!response.ok) {
    throw Error(response.statusText);
  }

  // Read the response body incrementally; each read may contain one or more SSE events.
  for (const reader = response.body.getReader(); ; ) {
    const {value, done} = await reader.read();

    if (done) {
      break;
    }

    const chunk = new TextDecoder().decode(value);

    // A single chunk can hold multiple "data: {...}" events, so split them apart.
    const subChunks = chunk.split(/(?<=})\n\ndata: (?={)/);

    for (const subChunk of subChunks) {
      const payload = subChunk.replace(/^data: /, "");
      document.body.innerText = JSON.parse(payload).chunk;
    }
  }
})();
</script>
</body>
</html>
After node server.js, navigate your browser to localhost:3000. You can also test the stream directly with curl localhost:3000/stream.
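The raw stream, which is what curl prints, looks roughly like this (each event is a data: line followed by a blank line):

data: {"chunk":0}

data: {"chunk":1}

data: {"chunk":2}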
I won't repeat the notes from John's answer, but, in short, we set the necessary headers and flush them to begin the connection, then use res.write to send a chunk of data. Call res.end() to terminate the connection on the server, or listen for res.on("close", ...) to handle the client closing the connection.
The client uses fetch and response.body.getReader(); the stream can be read with const {value, done} = await reader.read() and decoded with TextDecoder().decode(value).
See also https://masteringjs.io/tutorials/express/server-sent-events
Express 4.18.2, Node 18.16.0, Chrome Version 114.0.5735.110 (Official Build) (64-bit)
Upvotes: 10
Reputation: 1890
I disagree with using Socket.IO to implement basic Server-Sent Events. The browser API is dead simple and the implementation in Express requires only a couple of changes from a normal response route:
app.get('/streaming', (req, res) => {
    res.setHeader('Cache-Control', 'no-cache');
    res.setHeader('Content-Type', 'text/event-stream');
    res.setHeader('Access-Control-Allow-Origin', '*');
    res.setHeader('Connection', 'keep-alive');
    res.flushHeaders(); // flush the headers to establish SSE with client

    let counter = 0;
    let intervalId = setInterval(() => {
        counter++;

        if (counter >= 10) {
            clearInterval(intervalId);
            res.end(); // terminates SSE session
            return;
        }

        res.write(`data: ${JSON.stringify({num: counter})}\n\n`); // res.write() instead of res.send()
    }, 1000);

    // If client closes connection, stop sending events
    res.on('close', () => {
        console.log('client dropped me');
        clearInterval(intervalId);
        res.end();
    });
});
The snippet above uses setInterval() to simulate sending data to the client for 10 seconds, then ends the connection. The client will receive an error for the lost connection and will automatically try to re-establish it. To avoid this, you can close the client on error, or have the server send a specific event message that the client understands as a signal to close gracefully. If the client closes the connection, we can catch the 'close' event to gracefully end the connection on the server and stop sending events.
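For illustration, here is a minimal browser-side sketch (not part of the original answer) that consumes this route with the standard EventSource API and closes the connection instead of letting the browser reconnect forever; the 'bye' event name is purely hypothetical and would only fire if the server sent such an event before res.end():

<script>
    const source = new EventSource('/streaming');

    // Plain "data:" messages arrive as the default "message" event.
    source.onmessage = (e) => {
        console.log('num:', JSON.parse(e.data).num);
    };

    // Option 1: close on error once the server ends the stream,
    // so EventSource does not keep auto-reconnecting.
    source.onerror = () => source.close();

    // Option 2: close on a hypothetical named "bye" event, which the server
    // would have to send (event: bye) just before calling res.end().
    source.addEventListener('bye', () => source.close());
</script>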
express: 4.17.1 node: 10.16.3
Upvotes: 65
Reputation: 7046
You can definitely achieve this without other packages.
I wrote a blog post about this; part 1 sets out the basics.
You mustn't close the SSE connection, as that breaks the functionality. The whole point is that it is an open HTTP connection, which allows new events to be pushed to the client at any point.
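As a rough sketch of that idea (my own illustration, not taken from the blog post): set the SSE headers, never end the response, and write an event whenever one occurs:

app.get('/sse', (req, res) => {
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive'
    });

    // The connection stays open (no res.end()), so the server can push
    // a new event at any point.
    const timer = setInterval(() => res.write(`data: ${Date.now()}\n\n`), 1000);

    // Clean up only when the client itself disconnects.
    req.on('close', () => clearInterval(timer));
});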
Upvotes: 8
Reputation: 2038
New Answer:
Just use socket.io; it's so much easier and better! https://www.npmjs.com/package/socket.io#in-conjunction-with-express
basic setup:
const express = require('express');
const PORT = process.env.PORT || 5000;
const app = express();
const server = require('http').createServer(app);
const io = require('socket.io')(server);

// listen to socket connections
io.on('connection', function(socket){
    // get that socket and listen to events
    socket.on('chat message', function(msg){
        // emit data from the server
        io.emit('chat message', msg);
    });
});

// Tip: add the `io` reference to the request object through a middleware like so:
app.use(function(request, response, next){
    request.io = io;
    next();
});

server.listen(PORT);
console.log(`Listening on port ${PORT}...`);
and in any route handler, you can use socket.io:
app.post('/post/:post_id/like/:user_id', function likePost(request, response) {
    //...
    request.io.emit('action', 'user liked your post');
})
client side:
<script src="/socket.io/socket.io.js"></script>
<script src="https://code.jquery.com/jquery-1.11.1.js"></script>
<script>
    $(function () {
        var socket = io();

        $('form').submit(function(e){
            e.preventDefault(); // prevents page reloading
            socket.emit('chat message', $('#m').val());
            $('#m').val('');
            return false;
        });

        socket.on('chat message', function(msg){
            $('#messages').append($('<li>').text(msg));
        });
    });
</script>
full example: https://socket.io/get-started/chat/
Original Answer:
Someone (user: https://stackoverflow.com/users/451634/benny-neugebauer | from this article: addEventListener on custom object) literally gave me a hint on how to implement this without any other package except express! I have it working!
First, import Node's EventEmitter:
const EventEmitter = require('events');
Then create an instance:
const Stream = new EventEmitter();
Then create a GET route for event streaming:
app.get('/stream', function(request, response){
    response.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive'
    });

    // Named listener so it can be removed when the client disconnects;
    // otherwise every request leaks a listener that writes to a dead response.
    const listener = function(event, data) {
        response.write("event: " + String(event) + "\n" + "data: " + JSON.stringify(data) + "\n\n");
    };
    Stream.on("push", listener);

    request.on("close", function() {
        Stream.removeListener("push", listener);
    });
});
In this GET route, you write back that the request is 200 OK, with content-type text/event-stream, no caching, and keep-alive.
You also call the .on method of your EventEmitter instance, which takes 2 parameters: a string naming the event to listen for and a function to handle that event (that function can take as many parameters as it is given).
Now.... all you have to do to send a server event is to call the .emit method of your EventEmitter instance:
Stream.emit("push", "test", { msg: "admit one" });
The first parameter is a string of the event you want to trigger (make sure that it is the same as the one in the GET route). Every subsequent parameter to the .emit method will be passed to the listener's callback!
That is it!
Since your instance was defined in a scope above your route definitions, you can call the .emit method from any other route:
app.get('/', function(request, response){
    Stream.emit("push", "test", { msg: "admit one" });
    response.render("welcome.html", {});
});
Thanks to how JavaScript scoping works, you can even pass that EventEmitter instance around to other functions, even in other modules:
const someModule = require('./someModule');
app.get('/', function(request, response){
    someModule.someMethod(request, Stream)
        .then(obj => { return response.json({}) });
});
In someModule:
function someMethod(request, Stream) {
    return new Promise((resolve, reject) => {
        Stream.emit("push", "test", { data: 'some data' });
        return resolve();
    })
}
It's that easy! No other package needed!
Here is a link to Node's EventEmitter Class: https://nodejs.org/api/events.html#events_class_eventemitter
My example:
const EventEmitter = require('events');
const express = require('express');
const app = express();

const Stream = new EventEmitter(); // my event emitter instance

app.get('/stream', function(request, response){
    response.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive'
    });

    const listener = function(event, data) {
        response.write("event: " + String(event) + "\n" + "data: " + JSON.stringify(data) + "\n\n");
    };
    Stream.on("push", listener);

    // Stop pushing to this response once the client disconnects.
    request.on("close", function() {
        Stream.removeListener("push", listener);
    });
});

setInterval(function(){
    Stream.emit("push", "test", { msg: "admit one" });
}, 10000)
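For completeness, a minimal browser-side sketch (not part of the original answer) that consumes the route above with the standard EventSource API; since the server sends a named "test" event, the client listens for that name:

<script>
    // Connect to the /stream route defined above.
    const source = new EventSource('/stream');

    // The server writes "event: test", so subscribe to that named event.
    source.addEventListener('test', function(e) {
        const payload = JSON.parse(e.data);
        console.log(payload.msg); // "admit one"
    });
</script>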
Upvotes: 0
Reputation: 4333
Self-promotion: I wrote the ExpreSSE package, which provides middlewares for working with SSE in Express. You can find it on npm: @toverux/expresse.
A simple example:
router.get('/events', sse(/* options */), (req, res) => {
    let messageId = parseInt(req.header('Last-Event-ID'), 10) || 0;

    someModule.on('someEvent', (event) => {
        //=> Data messages (no event name, but defaults to 'message' in the browser).
        res.sse.data(event);

        //=> Named event + data (data is mandatory)
        res.sse.event('someEvent', event);

        //=> Comment, not interpreted by EventSource on the browser - useful for debugging/self-documenting purposes.
        res.sse.comment('debug: someModule emitted someEvent!');

        //=> In data() and event() you can also pass an ID - useful for replay with Last-Event-ID header.
        res.sse.data(event, (messageId++).toString());
    });
});
There is also another middleware to push the same events to multiple clients.
Upvotes: 2
Reputation: 156
It appears from the documentation of the library you're using that you should call res.sse when using it as middleware on a route. See:
https://www.npmjs.com/package/server-sent-events
But all this actually does, per their code, is wrap res.write, as you mentioned. See:
https://github.com/zacbarton/node-server-sent-events/blob/master/index.js#L11
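If you stay with that package, usage would look roughly like the sketch below. This is my own illustration based on the question's route and the fact that res.sse forwards a pre-formatted string to res.write; the exact signature may differ:

const express = require('express');
const sse = require('server-sent-events');
const app = express();

app.get('/events', sse, function(req, res) {
    // res.sse just writes an SSE-formatted string to the open response.
    const timer = setInterval(() => res.sse(`data: ${Date.now()}\n\n`), 1000);

    // Stop writing once the client disconnects.
    req.on('close', () => clearInterval(timer));
});

app.listen(3000);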
Upvotes: 1