user3363154

Reputation: 115

Am I serving http or http2?

Background

I have been hosting multiple node.js apps on a single VPS for a while, and all has been good.

I am able to do so by using nginx to route traffic to the different ports required by each node.js app.

Upgrading to http2

With the recent push for http2, I have been trying to support my apps by enabling http2. On nginx, I was able to do so by following nginx's guide.
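For reference, this is a minimal sketch of the kind of server block that guide describes (the domain, certificate paths, and backend port are placeholders for illustration, not taken from the question):

```nginx
# http2 applies only to the browser->nginx leg and in practice requires TLS.
server {
    listen 443 ssl http2;
    server_name example.com;                                  # placeholder

    ssl_certificate     /etc/ssl/certs/example.com.pem;       # placeholder path
    ssl_certificate_key /etc/ssl/private/example.com.key;     # placeholder path

    location / {
        # nginx speaks plain http/1.x to the node.js app on its port
        proxy_pass http://127.0.0.1:3000;
        proxy_http_version 1.1;
    }
}
```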

The problem

While doing a benchmark test, performance doesn't seem to increase when using http2 on a test website that fetches 47 requests for the home page.

Performance for http and http2 seems similar.

Theory

Could it be due to node.js serving http1 and not http2?

Do I have to set anything between nginx and my node.js app, e.g. proxy_http_version 2.0? However, proxy_http_version 2.0 doesn't seem to be available yet.

My node.js app serves with express, so am I really getting an http2 connection, or an http connection instead?

When I use an http2 indicator, it tells me that the website is using http2 by showing a blue light. Do I have to make use of molnarg's http2 module, or is the nginx http2 module sufficient?

Hope somebody with more dev-ops experience can help clear this up for me, and hopefully for many other developers too.

Upvotes: 4

Views: 2377

Answers (1)

Barry Pollard

Reputation: 45970

So there are a few things to be aware of.

For a start, it's worth going back to how http/2 differs from http/1.1, where that will help, and where it will not.

Http/2 primarily benefits latency: it helps when lots of resources need to be downloaded over a higher-latency connection. There are a few other benefits too, but that's the main performance one.

Under http/1.1, if you request a page (say index.html) and it loads 10 CSS resources and 10 JavaScript resources, then the browser requests each of those 20 resources in turn, waiting for the request to go all the way from the browser to the server and back before it can request the next resource. The server can probably produce each resource really quickly (especially static resources like CSS and JavaScript), so most of this time is spent travelling back and forth across the Internet rather than on processing at either end. These numbers are small (e.g. a 100ms round trip), but multiplied across many requests they add up (e.g. 2,000ms, or two seconds, for this example, ignoring processing time on either side).

Browsers try to get around this by opening multiple connections to the server (typically 4-6), so when there's a queue of requests (like the 20 here) they can run 4-6 in parallel rather than waiting for each request to complete serially. So in this example we might split the load so that 5 of the 20 resources go over each of 4 connections, and it might take only 500ms to download them all - a good improvement over a single connection. But it's a bit of a fudge and has its own issues in setting up and managing those extra connections.

Http/2 aims to reduce the impact of this by allowing requests to be sent in any order across a single connection, without having to wait for a response. So the browser can request those 20 resources over a single connection, one after the other, without waiting for each to come back, and the requests travel across the Internet at the same time, in parallel. So in the best-case scenario we only need to wait the length of a single round trip (100ms) for all 20 resources to be delivered.
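The back-of-the-envelope numbers above can be sketched as follows (100ms round trip, 20 resources, 4 parallel http/1.1 connections - all figures from the example, not measurements):

```javascript
const rttMs = 100;       // one browser<->server round trip, in ms
const resources = 20;    // requests needed for the page
const connections = 4;   // typical http/1.1 connection pool

// http/1.1, one connection: each request waits for the previous response.
const serial = resources * rttMs;

// http/1.1, several connections: requests split across the pool.
const pooled = Math.ceil(resources / connections) * rttMs;

// http/2: all requests multiplexed on one connection, best case.
const multiplexed = rttMs;

console.log({ serial, pooled, multiplexed }); // { serial: 2000, pooled: 500, multiplexed: 100 }
```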

The key point is that http/2 is not "faster" in itself for a single resource. It's just faster for many resources. In a totally optimal scenario (e.g. a latency of 0ms), http/1.1 and http/2 requests would be pretty much identical, with no real performance improvement.

So, with that theory behind us, let's point out a few things about your specific scenario:

  1. It matters less for the Nginx->Node connection than for the Browser->Nginx one. This is because latency will be many times worse on the browser connection (assuming your Nginx and Node sit on servers close enough together, or perhaps even on the same server, that latency between them is minimal). I would imagine your Nginx->Node connection is still http/1.1, but that's not really a problem for this reason, and also because Nginx can open multiple connections to different node servers at the same time (and even multiple connections to the same node server, for that matter).

  2. Where are you testing from, and how many hops to your webserver? If you're testing on a corporate network, for example, on the same network as the data centre, then latency will be low and the improvement may not be noticeable. I see from your screenshots that you set your connection to "Wi-Fi" in the developer tools, but that may still not be slow enough to show the performance increase.

  3. Is latency actually a problem for your webpage/app? It could be that your website is so well optimised that it gets the resources it needs, in the order it needs them, and the browser can barely keep up with the 20 responses coming in as it is. And then it takes time at either end to process those requests/responses - in the browser to parse and render the CSS/JavaScript, or on the server to serve up the requests (e.g. because it has to do a lot of processing or connect to a DB to return that resource). So network latency may not actually be a problem for your website (though for all but the simplest websites it usually is).

  4. How do you measure improvements, given http/2 is still fairly bleeding edge? I've heard anecdotal evidence that the Chrome developer tools are not yet reporting http/2 times accurately and are underselling the benefits. And other performance-testing tools may not be http/2 aware at all, so may fall back to http/1.1. Here's a list of http/2 testing resources, by the way, though most of them (like, potentially, the Chrome developer tools) seem more useful for testing whether you're using http/2 than for measuring any performance benefit from it: https://blog.cloudflare.com/tools-for-debugging-testing-and-using-http-2/

Upvotes: 7
