Biarnès Adrien

Reputation: 53

Overhead of separate JavaScript files on application loading time

My company is building a single-page application that uses JavaScript extensively. As time goes on, the number of JavaScript files included in the associated HTML page keeps growing.

We have a program that minifies the JavaScript files during the integration process, but it does not merge them, so the number of files is not reduced.

Concretely, this means that when the page loads, the browser requests the JavaScript files one by one, issuing an HTTP request each time.

Does anyone have metrics or benchmarks that indicate at what point the overhead of requesting the JavaScript files individually becomes enough of a problem to justify merging them into a single file?

Thanks

Upvotes: 1

Views: 82

Answers (3)

Kev

Reputation: 5452

I think you should look at your app's architecture before worrying about what is out there.

But this site should give you a good idea: http://www.browserscope.org/?category=network

Browsers and servers each have their own rules, and those rules differ. If you search for "HTTP request limit", you will find a lot of posts; for example, the maximum number of concurrent HTTP requests is enforced per domain.

Speaking more generally about software development, I like the component-based approach.

You should group your files per component. Depending on your application's requirements, you can load the mandatory components first and lazy-load the less-needed ones on demand. I don't think you should download the entire app up front if it is huge and has a lot of functionality that your users may or may not use.
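The lazy-loading idea above could be sketched roughly like this. This is my own illustration, not code from the question: the `inject` callback is an assumption, split out so the cache logic works the same whether scripts are loaded via a `<script>` tag in the browser or a stub elsewhere.

```javascript
// Minimal lazy loader: fetch each component script at most once.
// `inject` is pluggable so the loading mechanism stays out of the cache logic.
function createLoader(inject) {
  const cache = new Map(); // url -> Promise resolving when the script has loaded

  return function load(url) {
    if (!cache.has(url)) {
      cache.set(url, inject(url));
    }
    return cache.get(url);
  };
}

// Browser-side injector (assumed usage): append a <script> tag and
// resolve once it has finished loading.
function scriptInjector(url) {
  return new Promise((resolve, reject) => {
    const s = document.createElement('script');
    s.src = url;
    s.onload = () => resolve(url);
    s.onerror = reject;
    document.head.appendChild(s);
  });
}
```

With this, the mandatory components are loaded at startup and the rest only when the user first navigates to the feature that needs them.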

Upvotes: 0

freakish

Reputation: 56477

@Patrick already explained the benefits of merging. There is, however, also a benefit to having many small files. By default, browsers cap the number of parallel requests per domain. The HTTP standard suggested 2, but browsers no longer follow that. Requests beyond the limit have to wait.

You can use subdomains that redirect to your server, and write the client so that it uses a different subdomain for each file. That way you can download all files at the same time (requests won't queue), effectively increasing performance (note that you will probably need more static file servers to handle the traffic).
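The subdomain trick above boils down to mapping each asset path deterministically onto one of a few static hosts. A minimal sketch (the `staticN.example.com` naming and shard count are assumptions for illustration):

```javascript
// Deterministically map an asset path to one of N hypothetical static
// subdomains. The same file always resolves to the same host (so it stays
// cacheable), while different files spread across hosts, sidestepping the
// per-domain parallel-request limit.
function shardUrl(path, shards = 4, domain = 'example.com') {
  let hash = 0;
  for (let i = 0; i < path.length; i++) {
    hash = (hash * 31 + path.charCodeAt(i)) >>> 0; // simple rolling hash
  }
  return `https://static${hash % shards}.${domain}${path}`;
}
```

Determinism matters here: if the same file were fetched from a different subdomain on each page load, the browser cache would be defeated.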

I haven't seen this used in real life, but I think it's an idea worth mentioning and testing. Related:

Max parallel http connections in a browser?

Upvotes: 0

Patrick Hofman

Reputation: 156978

It really depends on the number of users, on the number of connections allowed by the server, and on the maximum number of connections the client allows.

Generally, a browser can make multiple HTTP requests at the same time, so in theory there shouldn't be much difference between one JavaScript file and a few.

You don't only have to consider the JavaScript files but the images too, of course, so a high number of files can indeed slow things down (if you hit the maximum number of simultaneous connections on the server or client side). In that regard, it would be wise to merge those files.

Upvotes: 2
