AlessandraFrancez

Reputation: 21

How can I rate limit the /socket.io route on my http server?

I'm using socket.io and have tried both rate-limiter-flexible and limiter to limit the request rate, but I noticed the /socket.io route never receives a 429 from either. In fact, I can't even log requests to this route with app.use('/socket.io').

I think socket.io is handling this route itself under the hood, is that correct? If so, how can I make sure requests to /socket.io also receive a 429 once the limit is reached?

Rate limiter on Connect:

this.io.on(this.SocketConstants.CONNECTION, async (client) => {
  this.client = client;
  try {
    // Consume one point per connection, keyed by the client's IP address
    await this.rateLimiter.consume(client.handshake.address);
  } catch (rejRes) {
    // On flood: reject and drop the connection
    client.error('Too many requests');
    client.disconnect(true);
  }
});

Rate limiter on http server:

class RateLimiting {
    constructor() {
        this.limiter = new RateLimiter(2, 'minute', true);
        this.index = this.index.bind(this);
    }

    index(req, res, next) {
        try {
            this.limiter.removeTokens(1, function (err, remainingRequests) {
                // Reject on errors as well as when the bucket is empty
                if (err || remainingRequests < 1) {
                    res.writeHead(429, {'Content-Type': 'text/plain;charset=UTF-8'});
                    res.end('429 Too Many Requests - your IP is being rate limited');
                    // res.status(429).json({ message: 'Too many requests' });
                } else {
                    next();
                }
            });
        }
        catch (err) {
            console.log('Error', err);
        }
    }
}
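
For context, a middleware like this would typically be wired up as shown in the sketch below (the mounting is illustrative, not taken from the original setup):

const express = require('express');
const { RateLimiter } = require('limiter'); // used by the RateLimiting class above

const app = express();
const rateLimiting = new RateLimiting();

// Every route registered on app goes through the limiter...
app.use(rateLimiting.index);

// ...but requests to /socket.io never reach it, which is the problem described above.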

Edit

To anyone with a similar problem: I ended up trying several different approaches, and the easiest were these:

  1. Write your own allowRequest

AllowRequest is a pass/fail function that you can use to override the default "checkRequest" function (reference here)

checkBucket(err, remainingRequests) {
    // Allow the request only while tokens remain in the bucket
    return !err && remainingRequests >= 1;
  }

allowRequest(req, callback) {
    this.limiter.removeTokens(1, (err, remainingRequests) => {
      const allowed = this.checkBucket(err, remainingRequests);
      callback(allowed ? null : 'Too many requests', allowed);
    });
  }

And the server is started as

this.io = this.socketServer(server, {
      // bind so "this.limiter" is available inside allowRequest
      allowRequest: this.allowRequest.bind(this)
    });
  2. Pick a different route, use Express to add any middleware you need, and disable the default route

You can accomplish this by setting serveClient to false and setting the path to whatever you need.

this.io = this.socketServer(server, { path: '/newSocketRoute', serveClient: false });
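
On the client side, the same custom path has to be passed when connecting; a minimal sketch that just mirrors the path above:

// Client-side: point the socket.io client at the custom server path
const socket = io({ path: '/newSocketRoute' });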

This didn't quite work for me, so there's probably something wrong in the way I'm serving the socket files, but this is what it looked like:

app.get("/socket", this.limiter.initialize(), function(req, res) {
        if (0 === req.url.indexOf('/newSocketRoute/socket.io.js.map')) {
          res.sendFile(join(__dirname, "../../node_modules/socket.io-client/dist/socket.io.js.map"));
        } else if (0 === req.url.indexOf('/newSocketRoute/socket.io.js')) {
          res.sendFile(join(__dirname, "../../node_modules/socket.io-client/dist/socket.io.js"));
        }
      });

Upvotes: 2

Views: 6088

Answers (1)

jfriend00

Reputation: 707218

Socket.io attaches itself to your http server by inserting itself as the first listener to the request event (socket.io code reference here), which means it totally bypasses any Express middleware.
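
Roughly, the attach logic works like the following sketch (a simplified illustration of what engine.io does when it attaches, not the actual source; engine.handleRequest stands in for its internal handler):

// Simplified sketch of how socket.io/engine.io takes over the "request" event
const listeners = httpServer.listeners('request').slice(0); // save existing handlers (e.g. Express)
httpServer.removeAllListeners('request');

httpServer.on('request', (req, res) => {
  if (req.url.startsWith('/socket.io/')) {
    // handled internally; Express (and its middleware) never sees it
    engine.handleRequest(req, res);
  } else {
    // everything else is passed back to the original listeners
    for (const listener of listeners) {
      listener.call(httpServer, req, res);
    }
  }
});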

If you're trying to rate limit requests to /socket.io/socket.io.js (the client-side socket.io code), then you could create your own Express route for that file at a different path, have your client load the file from that path, and disable serving it through socket.io (there's an option for that).
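
For example, something along these lines (a sketch assuming the express-rate-limit package and the serveClient option; the route path is arbitrary):

const path = require('path');
const rateLimit = require('express-rate-limit');

// Serve the client bundle yourself so normal Express middleware applies to it
const clientFileLimiter = rateLimit({ windowMs: 60 * 1000, max: 2 });

app.get('/client/socket.io.js', clientFileLimiter, (req, res) => {
  res.sendFile(
    path.join(__dirname, 'node_modules/socket.io-client/dist/socket.io.js')
  );
});

// ...and stop socket.io from serving it on /socket.io/socket.io.js
const io = require('socket.io')(server, { serveClient: false });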

If you're trying to rate limit incoming socket.io connections, then you may have to modify your rate limiter so it can participate in the socket.io connect event.
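
With rate-limiter-flexible (already used in the question), that can be done in a socket.io middleware that runs before the connection event fires; a minimal sketch:

const { RateLimiterMemory } = require('rate-limiter-flexible');

const connectionLimiter = new RateLimiterMemory({ points: 5, duration: 60 });

// socket.io middleware runs for every incoming connection before "connection" fires
io.use((socket, next) => {
  connectionLimiter.consume(socket.handshake.address)
    .then(() => next())
    .catch(() => next(new Error('Too many requests')));
});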


Now that I think of it, you could hack the request event listener list just like socket.io does (see the code link above for how it does it) and insert your own rate limiting before socket.io gets to see the request. What socket.io is doing is implementing a poor man's middleware and cutting to the front of the line so that it gets first crack at any incoming http request and can hide the request from other listeners once it has handled it. You could do the same with your rate limiter, though then you'd be in an arms race to see who gets first crack. Personally, I'd probably just hook the connect event and kill the connection there if rate limiting rules are being violated. But you could hack away and get in front of the socket.io code.
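
A rough sketch of that listener trick, run after socket.io has attached to the server (isRateLimited is a placeholder for whatever rate limiting check you use, not a real API):

// Run AFTER socket.io has attached itself to the server
const originalListeners = server.listeners('request').slice(0); // includes socket.io's handler
server.removeAllListeners('request');

server.on('request', (req, res) => {
  if (req.url.startsWith('/socket.io/') && isRateLimited(req)) {
    // Answer the request ourselves before socket.io ever sees it
    res.writeHead(429, { 'Content-Type': 'text/plain' });
    res.end('429 Too Many Requests');
    return;
  }
  // Otherwise hand the request back to the original listeners (socket.io, Express, ...)
  for (const listener of originalListeners) {
    listener.call(server, req, res);
  }
});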

Upvotes: 1
