Reputation: 1910
AFAIK the entire idea behind reducing HTTP requests to increase website speed lies in HTTP/1.x's inability to multiplex concurrent requests over a single connection. HTTP/2 allows concurrent (multiplexed) requests.
Is it still a performance benefit to reduce the number of HTTP requests made?
Or is it more effective to make numerous, smaller HTTP requests as they're handled concurrently?
Or is there a happy medium based on the number of concurrent requests a site/browser can handle?
I'm specifically using nginx for this, but I assume the same question applies equally to Apache and other web servers.
Upvotes: 1
Views: 214
Reputation: 46080
HTTP/2 makes requests cheaper - not free.
HTTP requests still have a cost: looking up the local cache, building the request, sending it, waiting for a response, receiving the result, deciding whether to cache it for next time, processing the result, and so on. For this reason, browsers limit the number of requests they have in flight at once.
Servers also typically limit each connection to around 100 in-flight responses at once.
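For nginx in particular, that per-connection limit is a tunable rather than a hard protocol constant. A minimal sketch of an HTTP/2 server block (certificate paths and names are placeholders):

```nginx
# Minimal sketch of an nginx server block with HTTP/2 enabled
# (certificate paths and server_name are placeholders).
server {
    listen 443 ssl http2;
    server_name example.com;

    ssl_certificate     /etc/nginx/certs/example.pem;
    ssl_certificate_key /etc/nginx/certs/example.key;

    # Cap on concurrent HTTP/2 streams (in-flight responses) per
    # connection; nginx's default for this directive is 128.
    http2_max_concurrent_streams 128;

    root /var/www/html;
}
```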
Others have also found that completely forgoing bundling leads to performance issues, in part for the above reasons and also because compression (gzip and Brotli) is less efficient on smaller files, so more bytes end up being sent.
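As a rough illustration of the compression point, here's a small sketch using Python's standard-library gzip module on made-up, repetitive sample data, comparing many small files compressed individually against the same content compressed as one bundle:

```python
import gzip

# Hypothetical "modules": 50 small, similar-looking snippets, the kind of
# repetitive content where a shared compression context helps most.
modules = [
    f"export function handler{i}(req) {{ return render(template, req.params); }}\n"
    for i in range(50)
]

# Compress each small file on its own: every file pays the gzip container
# overhead and starts from an empty compression dictionary.
separate = sum(len(gzip.compress(m.encode())) for m in modules)

# Compress the same content as a single bundle: one container, and later
# modules can reference byte sequences already seen in earlier ones.
bundled = len(gzip.compress("".join(modules).encode()))

print(f"{len(modules)} files compressed separately: {separate} bytes")
print(f"same content compressed as one bundle: {bundled} bytes")
```

Exact numbers depend on the content, but the bundled total is usually noticeably smaller, since gzip (and Brotli) can reuse repeated byte sequences across the whole input and only pays the per-file overhead once.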
The general consensus seems to be: 1) test, and 2) bundle less aggressively, splitting code into functional pieces, but don't go as far as getting rid of bundling completely.
Upvotes: 1