Brave Soul

Reputation: 3620

Increase Concurrent HTTP calls

I went through many posts about this on SO but couldn't find a suitable solution.

From one of the answers I got this table of maximum concurrent connections to one domain:

IE 6 and 7:      2
IE 8:            6
IE 9:            6
IE 10:           8
IE 11:           8
Firefox 2:       2
Firefox 3:       6
Firefox 4 to 46: 6
Opera 9.63:      4
Opera 10:        8
Opera 11 and 12: 6
Chrome 1 and 2:  6
Chrome 3:        4
Chrome 4 to 23:  6
Safari 3 and 4:  4

How can I make more concurrent HTTP calls to one domain than the maximum set by browsers?

I went through this

One trick you can use to increase the number of concurrent connections is to host your images on a different subdomain. These will be treated as separate requests; it is each domain that is limited to the concurrent maximum.

IE6 and IE7 have a limit of two. IE8's limit is 6 if you are on broadband, 2 if you are on dial-up.

But I don't have a scenario like this. I am fetching specific data from a single web server. How can I overcome this?

I make 14 HTTP calls to the same server at startup, which is why the actual page takes so long to load. How can I increase the performance of my website through concurrent AJAX/HTTP calls?

Upvotes: 15

Views: 7502

Answers (7)

Andrii Muzalevskyi

Reputation: 3329

14 requests are not an issue in themselves; they become one only if the server's response time is large. So most likely the root issue is server-side performance.

These solutions are possible:

  • use HTTP cache (server should send corresponding headers)
  • use cache at the middle (e.g. CDN, Varnish)
  • optimize server side
    • content related:
      • combine several requests into one
      • remove duplicated information in requests
      • do not load information which client doesn't render
    • use cache at server side
    • etc.; there are plenty of other approaches.
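The "combine several requests into one" point above can be sketched on the client side. This is a minimal sketch assuming a hypothetical `/api/batch` endpoint that accepts a list of paths and returns one result per path; the endpoint name and response shape are illustrative, not from the question.

```javascript
// Sketch of "combine several requests into one": instead of 14 separate
// GETs, send one POST describing all the resources the page needs.
// The /api/batch endpoint and its response shape are assumptions.

function buildBatchPayload(paths) {
  // Deduplicate so the server doesn't do the same work twice.
  const unique = [...new Set(paths)];
  return JSON.stringify({ requests: unique.map((path) => ({ path })) });
}

async function fetchBatch(paths) {
  const res = await fetch('/api/batch', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: buildBatchPayload(paths),
  });
  return res.json(); // assumed shape: { results: [...] }, one per request
}
```

With this, the 14 startup calls collapse into a single request, which sidesteps the per-domain connection limit entirely.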

UPDATE:

Suggestions for people who have to download static resources and have trouble with that:

  1. Check size of resources and optimize where possible

  2. Use HTTP/2 - it shares a single connection between requests, so the server is less loaded and responds faster, mostly because it doesn't need to establish a separate TLS connection per request (the web is secure nowadays; everybody uses HTTPS)

  3. Browsers limit the number of parallel requests to a single domain. This leaves the option of using several different domains (or subdomains) to download the required resources, increasing the number of parallel requests

Upvotes: 8

Marinos An

Reputation: 10816

If your server is able to perform tasks concurrently, there is no added value in opening multiple connections to it, other than being able to update your UI earlier (before the slowest tasks complete), e.g. to let users interact with the page before all the tasks have finished. This offers a good user experience.

However, there are other ways to achieve this than opening parallel HTTP connections. In short, you can merge all your endpoints into one endpoint that handles all the tasks and asynchronously places the result of each finished task into the response. The client can then process each task's result as soon as it arrives.

To achieve the above you need some sort of protocol/API that operates on top of an HTTP connection, or a connection that has been upgraded to WebSocket, and that provides this async response message feature.
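One possible realization of "one endpoint, many async results" is a streamed NDJSON response: the server writes one JSON line per finished task, and the client handles each line as it arrives. The line format here is an assumption for illustration, not a standard API; the parser below is a sketch you would feed with chunks from `response.body`.

```javascript
// Incremental NDJSON parser: the server streams one JSON object per line,
// one line per finished task. Chunks may split a line anywhere, so we
// buffer the trailing partial line until the next chunk arrives.

function createNdjsonParser(onResult) {
  let buffer = '';
  return function feed(chunk) {
    buffer += chunk;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep the partial last line for the next chunk
    for (const line of lines) {
      if (line.trim()) onResult(JSON.parse(line));
    }
  };
}
```

With `fetch`, you would read `response.body` with a `ReadableStream` reader, decode each chunk to text, and pass it to `feed`, updating the map, pivot, charts, and grids as their results come in.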

Upvotes: 1

Nurbol Alpysbayev

Reputation: 21881

Just use HTTP/2. HTTP/1.x has a limit on concurrent connections per domain, about 6-10 depending on the browser.

Upvotes: 1

Anton Boritskiy

Reputation: 1569

I have a data driven application. ... I have google map, pivot, charts and grids

In comments you mentioned that the data is coming from different providers; in case those are on different domains, try using dns-prefetch, like this:

   <link rel="dns-prefetch" href="http://www.your-data-domain-1.com/">
   <link rel="dns-prefetch" href="http://www.your-data-domain-2.com/">
   <link rel="dns-prefetch" href="http://www.3rd-party-service-1.com/">
   <link rel="dns-prefetch" href="http://www.3rd-party-service-2.com/">

You need to list all domains which you are calling via AJAX and which are not the actual domain of your website itself.

It forces the browser to send the DNS requests as soon as it reads and parses that HTML, rather than when your code first requests data from those domains. It might save you up to a few hundred milliseconds when the browser actually makes an AJAX request for the data.


Upvotes: 1

Emeeus

Reputation: 5250

How to call more than the maximum http calls set by browsers to one domain.

That is an HTTP/1.1 limit (6-8). If you are able to change the server (you tagged this question as http), the best solution is to use HTTP/2 (RFC 7540) instead of HTTP/1.1.

HTTP/2 multiplexes many HTTP requests on a single connection. While HTTP/1.1 has a limit of roughly 6-8 connections, HTTP/2 does not have a standard limit; instead it says that "it is recommended that this value (SETTINGS_MAX_CONCURRENT_STREAMS) be no smaller than 100" (RFC 7540). That number is much better than 6-8.

Upvotes: 5

Pogrindis

Reputation: 8091

Just to extend Charly Koza's answer, as it has some limitations depending on user count etc.

The first thing you should look into is using CDNs; I will assume you have done this already.

The fact that you are only hitting one server is not a problem: the browser allows concurrent connections per DNS host, not just per IP.

If you have access to your DNS management and can dynamically spawn up a new subdomain, look to free services like CloudFlare's API.

Alternatively, create a wildcard domain, which will allow any subdomain to point to one server.

This way, on the server side, you can detect whether the user already has X connections active; if so, do the following:

  • Dynamically create a new subdomain on the same IP, or, if using the wildcard, pick a random subdomain such as newDomainRandom.domain.com
  • Then return the user a 301 redirect to the new domain; the user's client will register this as a new connection to another domain.

There is a lot of pseudo work here, but this is more of a networking issue than a coding issue.

Compulsory warning on this method, though:

There are no limits on using 301 redirects on a site. You can implement more than 100k 301 redirects without any penalty. But too many 301 redirects put unnecessary load on the server and reduce speed.
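The redirect step above could be sketched like this. The function names and the base domain are illustrative; a real implementation would sit in your server framework's request handler and would also check the per-user connection count first.

```javascript
// Sketch of the redirect trick: once a client exceeds its connection
// budget, answer with a 301 pointing at a fresh random subdomain that
// (via the wildcard DNS record) resolves to the same server.

function pickShardHost(baseDomain) {
  // Random label, e.g. "shardk3f9q2.domain.com" (hypothetical naming).
  const label = 'shard' + Math.random().toString(36).slice(2, 8);
  return `${label}.${baseDomain}`;
}

function buildRedirect(originalUrl, baseDomain) {
  const url = new URL(originalUrl);
  url.hostname = pickShardHost(baseDomain);
  return { status: 301, location: url.toString() };
}
```

The browser treats the new hostname as a separate domain and grants it its own pool of concurrent connections.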

Upvotes: 5

Cactusbone

Reputation: 1076

What you can do is distribute that load across many subdomains. Instead of using only www, use www1, www2, www3, www4 and round-robin between them client-side.

You'll need to configure your web server so that all www* subdomains end up in the same place.
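The client-side round-robin could look like this minimal sketch. The base domain and subdomain count are placeholders; all www* hosts are assumed to serve identical content.

```javascript
// Round-robin requests over www1..wwwN so each subdomain gets its own
// browser connection pool. All subdomains must point at the same server.

function createRoundRobin(baseDomain, count) {
  let next = 0;
  return function shardUrl(path) {
    const host = `www${(next % count) + 1}.${baseDomain}`;
    next += 1;
    return `https://${host}${path}`;
  };
}

// Usage (hypothetical domain):
//   const shardUrl = createRoundRobin('example.com', 4);
//   fetch(shardUrl('/api/chart-data'));  // goes to www1.example.com
//   fetch(shardUrl('/api/grid-data'));   // goes to www2.example.com
```

Note that each extra subdomain costs a DNS lookup and a TLS handshake, so a handful of shards is usually the sweet spot.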

Upvotes: 9
