Jack M.

Reputation: 3804

Autobahn vs Einaros - WebSockets with Node JS

I need to write a WebSocket server and I am learning Node JS by reading some books I purchased. This server is for a very fast-paced game, so I need to stream small messages to groups of clients as quickly as possible.

What is the difference between:

Autobahn | JS : http://autobahn.ws/js/

and

Einaros : https://github.com/einaros/ws

?

I have heard that Autobahn is very powerful and capable of handling 200k clients without a load balancer, so I was wondering if someone with more experience could advise me whether there is any advantage in opting for one library or the other.

Upvotes: 1

Views: 1612

Answers (1)

oberstet

Reputation: 22051

The functional difference is: Einaros is a WebSocket library, whereas Autobahn provides WebSocket implementations (e.g. AutobahnPython), plus WAMP on top of WebSocket.
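
For the raw WebSocket side, a minimal Einaros-based broadcast server might look like the sketch below (the port and the "relay every message to all clients" logic are illustrative assumptions, not something either project prescribes):

    // Sketch: raw WebSocket server with the einaros/ws package.
    // Port 8080 and the broadcast behaviour are illustrative placeholders.
    var WebSocketServer = require('ws').Server;

    var wss = new WebSocketServer({ port: 8080 });

    wss.on('connection', function (ws) {
        ws.on('message', function (message) {
            // Relay each incoming message to every connected client
            // whose socket is still open (readyState 1 === OPEN).
            wss.clients.forEach(function (client) {
                if (client.readyState === 1) {
                    client.send(message);
                }
            });
        });
    });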

WAMP provides higher-level communication for apps (RPC + PubSub - please see the WAMP website), and AutobahnJS is a WAMP implementation for browsers (and NodeJS) on top of WebSocket.
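
To make the difference concrete, here is a rough sketch of what AutobahnJS (WAMP) usage looks like in NodeJS - the router URL, realm and topic/procedure names are made-up examples, not defaults:

    // Sketch: WAMP client (PubSub + RPC) with AutobahnJS in NodeJS.
    // URL, realm, topic and procedure names are illustrative assumptions.
    var autobahn = require('autobahn');

    var connection = new autobahn.Connection({
        url: 'ws://127.0.0.1:8080/ws',
        realm: 'realm1'
    });

    connection.onopen = function (session) {

        // PubSub: subscribe to a topic ...
        session.subscribe('com.example.topic1', function (args) {
            console.log('event received:', args[0]);
        });

        // ... and publish an event to it.
        session.publish('com.example.topic1', ['hello']);

        // RPC: call a procedure registered by some other WAMP client.
        session.call('com.example.add2', [2, 3]).then(function (result) {
            console.log('result:', result);
        });
    };

    connection.open();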

Now, say you don't care about WAMP, and hence only need a raw WebSocket server. Then you can compare AutobahnPython with Einaros primarily based on non-functional characteristics, like protocol compliance, security and performance.

Autobahn has best-in-class protocol compliance. I dare say that, since the Autobahn project also provides the quasi industry-standard WebSocket test suite - used by most projects, including Einaros. Autobahn has 100% strict passes on all tests. Einaros probably does as well - I don't know.

Performance: yes, a single AutobahnPython based WebSocket server (4GB RAM, 2 cores, PyPy, FreeBSD in a VirtualBox VM) can handle 200k connected clients. To give you some more data points: here is a post with performance benchmarks on the RaspberryPi.

In particular, this post highlights the most important (IMO) metric: 95%/99% quantile messaging latency. You shouldn't look only at average latency, since there can be big skews and massive outliers. What you want is consistent low latency.

Achieving consistent low latency is non-trivial. E.g. one factor for languages/run-times like NodeJS or PyPy (a JITted Python implementation) is the garbage collector. Every time the GC runs, it'll slow things down - potentially introducing large latencies in messaging. I have done extensive benchmarking (unpublished) which indicates that PyPy's incremental GC is very good in this regard - better than HotSpot (JVM) and NodeJS (Google V8). When in doubt, and since I haven't (yet) published numbers, you shouldn't believe me, but measure yourself.

The one thing I'd strongly recommend: don't rely on average latency; measure quantiles and do histograms.
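
To illustrate why, here is a small self-contained NodeJS sketch (the latency samples are made up) showing how a few outliers leave the average looking harmless while the 95%/99% quantiles expose them:

    // Sketch: average vs. quantile latency over collected samples (ms).
    // The sample values are invented for illustration; in practice you
    // would record the round-trip time of each message.
    function quantile(samples, q) {
        var sorted = samples.slice().sort(function (a, b) { return a - b; });
        var idx = Math.min(sorted.length - 1, Math.floor(q * sorted.length));
        return sorted[idx];
    }

    var latencies = [1.2, 1.3, 1.1, 1.4, 9.8, 1.2, 1.3, 25.0, 1.2, 1.1];
    var avg = latencies.reduce(function (a, b) { return a + b; }) / latencies.length;

    console.log('avg:', avg);                        // ~4.5 ms - looks fine
    console.log('p95:', quantile(latencies, 0.95));  // 25 ms - the real story
    console.log('p99:', quantile(latencies, 0.99));  // 25 ms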

Disclosure: I am the original author of Autobahn and work for Tavendo.

Upvotes: 6
