Saif Bechan

Reputation: 17121

Streaming data with Node.js

I want to know if it is possible to stream data from the server to the client with Node.js. I want to post a single AJAX request to Node.js, then leave the connection open and continuously stream data to the client. The client will receive this stream and update the page continuously.

Update:

As an update to this question: I cannot get this to work. Nothing written with response.write() arrives at the client before close() is called. I have set up an example program that I use to try to achieve this:

Node.js:

var sys = require('sys'), 
http = require('http');
http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/html'});
    setInterval(function(){
        // take a fresh timestamp on every tick, otherwise the
        // same time would be sent over and over
        var currentTime = new Date();
        res.write(
            currentTime.getHours()
            + ':' +
            currentTime.getMinutes()
            + ':' +
            currentTime.getSeconds()
        );
    },1000);
}).listen(8000);

HTML:

<html>
    <head>
        <title>Testnode</title>
    </head>

    <body>
        <!-- This field needs to be updated -->
        Server time: <span id="time">&nbsp;</span>

        <!-- import jQuery from google -->
        <script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>

        <!-- request the streamed time from the node server -->
        <script type="text/javascript">
            $(document).ready(function(){
            // nginx proxies node.localhost to port 8000 on this machine
                $('#time').load('http://node.localhost');
            });
        </script>
    </body>
</html>

Using this method I don't get anything back until I call close(). Is streaming possible, or should I use a long-polling approach instead, where I call the load function again as soon as a response comes in?

Upvotes: 44

Views: 41624

Answers (4)

Bobby Iliev

Reputation: 316

This is now possible with Node.js, pg-query-stream and Materialize.

Materialize is PostgreSQL-compatible, which means that Node.js applications can use any existing PostgreSQL client to interact with Materialize.

However, unlike PostgreSQL, Materialize lets you take advantage of incrementally updated materialized views from your Node.js application: instead of querying Materialize for the state of a view at a point in time, you can use a TAIL statement to request a stream of updates as the view changes.

A sample app would look like this:

import express from 'express'
import pg from 'pg'
import QStream from 'pg-query-stream'

const app = express();
const port = 3000

app.get('/questions', async (request, response) => {

    const client = new pg.Client('postgres://materialize@SERVER_IP:6875/materialize');

    await client.connect();

    const query = new QStream('TAIL your_materialized_view WITH (PROGRESS)', [], {batchSize: 1});

    const stream = client.query(query);

    response.setHeader('Content-Type',  'text/event-stream');

    for await (const event of stream) {
        if(event.id){
            // SSE messages must end with a blank line, hence \n\n
            response.write(`data: ${JSON.stringify(event)}\n\n`);
        }
    }

})

app.listen(port)
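One detail worth noting: the Server-Sent Events wire format requires each message to end with a blank line, or the browser keeps buffering it. A small helper (hypothetical, not part of pg-query-stream) makes the framing explicit:

```javascript
// Format one Server-Sent Events message: "data: <json>\n\n".
// The trailing blank line tells the browser the event is complete.
function sseMessage(payload) {
    return 'data: ' + JSON.stringify(payload) + '\n\n';
}
```

Inside the stream loop you would then write `response.write(sseMessage(event));`.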


Upvotes: 2

Kuroki Kaze

Reputation: 8481

It is possible. Just use response.write() multiple times.

var body = ["hello world", "early morning", "richard stallman", "chunky bacon"];
// send headers
response.writeHead(200, {
  "Content-Type": "text/plain"
});

// send data in chunks
body.forEach(function (piece) {
    response.write(piece, "ascii");
});

// close connection
response.end();

You may have to close and reopen connection every 30 seconds or so.

EDIT: this is the code I actually tested:

var sys = require('sys'),
http = require('http');
http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/html'});
    sys.puts('Starting sending time');
    var interval = setInterval(function(){
        // take a fresh timestamp on every tick
        var currentTime = new Date();
        res.write(
            currentTime.getHours()
            + ':' +
            currentTime.getMinutes()
            + ':' +
            currentTime.getSeconds() + "\n"
        );
    },1000);

    // stop streaming after 10 seconds
    setTimeout(function() {
        clearInterval(interval);
        res.end();
    }, 10000);
}).listen(8090, '192.168.175.128');

I connected to it with telnet and it does indeed give out a chunked response. But to use this with AJAX, the browser has to support XHR.readyState = 3 (partial response). Not all browsers support this, as far as I know, so you are better off using long polling (or WebSockets for Chrome/Firefox).

EDIT2: Also, if you use nginx as a reverse proxy in front of Node, it sometimes buffers all the chunks and sends them to the user at once. You need to tweak its proxy buffering settings.
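A minimal sketch of the relevant nginx tweak, assuming a proxy to Node on port 8000 (the address and timeout value are illustrative):

```nginx
location / {
    proxy_pass http://127.0.0.1:8000;
    # Pass chunks through as they arrive instead of buffering the response
    proxy_buffering off;
    # Keep the connection open for long-lived streams
    proxy_read_timeout 3600s;
}
```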

Upvotes: 28

Andrey Antukh

Reputation: 195

You can also stop the interval once the client closes the connection:

app.get('/sse/events', function(req, res) {
    res.header('Content-Type', 'text/event-stream');

    var interval_id = setInterval(function() {
        // frame the message for SSE: "data:" prefix plus trailing blank line
        res.write('data: some data\n\n');
    }, 50);

    req.socket.on('close', function() {
        clearInterval(interval_id);
    }); 
}); 

This example uses Express; without Express, the plain Node http version looks much the same.

Upvotes: 6

BMiner

Reputation: 17097

Look at Socket.IO. It provides HTTP/HTTPS streaming and uses various transports to do so:

  • WebSocket
  • WebSocket over Flash (+ XML security policy support)
  • XHR Polling
  • XHR Multipart Streaming
  • Forever Iframe
  • JSONP Polling (for cross domain)

And! It works seamlessly with Node.js. It's also available as an npm package.

https://github.com/LearnBoost/Socket.IO

https://github.com/LearnBoost/Socket.IO-node

Upvotes: 20
