Reputation: 4408
I was wondering how much you gain by putting all of your CSS, scripts, and other downloadable assets into one file.
I know that you can gain a lot by using sprites, but at some point it might actually hurt to do that.
For example, my website uses a lot of small icons, and most of the pages have different icons. After combining all those icons together I might end up with over 500 KB in total, but if I make one sprite per page it is reduced to almost 50 KB per page, so that's cool.
But what about JS/CSS scripts?
How much would I gain by making a separate script for each page, each just over ~100 lines? Or maybe I wouldn't gain anything at all?
Basically, I want to know how much a single request costs when downloading a file, and whether it is really bad to have many script/image files with today's modern browsers and high-speed connections.
EDIT
Thank you all for your answers. It was hard to choose just one, because every answer addressed my question. I chose to reward the one that, in my opinion, answered my question about request cost most directly. I will not mark any single answer as correct, because all of them were.
Upvotes: 17
Views: 4905
Reputation: 7697
It is really bad to have a lot of 100-line files, and it is also really bad to have just one or two big files, for each type (CSS/JS/markup).
Desktops mostly have high-speed connections, while mobile connections also have high latency.
Given all the theory on this topic, I think the best approach should be more practical and less exact: base it on actual connection speeds and device types, from a statistical point of view.
For example, I think this is the best way to go today:
1) Put all the assets needed to show the first page/functionality to the user into one file, which should be under 100 KB; this is an absolute requirement.
2) After that, split or group the files into sizes such that the latency is no longer noticeable next to the download time.
To make it simple and concrete, if we assume the time to first byte is around ~200 ms, the size of each file should be between ~120 KB and ~200 KB, which is good for most of today's connections, on average.
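To make the arithmetic behind that range concrete, here is a rough sketch; the 5 Mbit/s effective bandwidth figure is my own assumption, not part of the answer. The point is that in the ~120-200 KB range the fixed ~200 ms latency stays comparable to the download time instead of dominating it.

```javascript
// Rough model of one request: fixed latency (time to first byte) plus
// payload transfer time. All numbers are illustrative assumptions.
function requestTimeMs(sizeKB, ttfbMs, bandwidthMbps) {
  // sizeKB * 8 gives kilobits; Mbit/s is the same as kbit/ms.
  const transferMs = (sizeKB * 8) / bandwidthMbps;
  return ttfbMs + transferMs;
}

// A 150 KB file at 5 Mbit/s with 200 ms TTFB: 200 + 240 = 440 ms total,
// so latency is ~45% of the request: noticeable but not dominant.
console.log(requestTimeMs(150, 200, 5)); // 440
```

With much smaller files the fixed 200 ms would dwarf the transfer time, which is exactly why the answer suggests grouping files up to this size.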
Upvotes: 1
Reputation: 33397
This is actually a very relevant question (topic) that many web developers face.
I would also like to add my answer among the other contributors to this question.
High-performance web sites depend on different factors; here are some considerations:
By defining your needs, you may be able to find the proper strategy.
As regards your website, I recommend you look at Steve Souders' 14 rules in his book High Performance Web Sites.
Steve Souders' 14 rules:
So if we take JS/CSS into consideration, the following will help a lot: say you have page1, page2, page3 and page4. page1 and page2 use js1 and js2, page3 uses only js3, and page4 uses all of js1, js2 and js3.
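One way to act on that example is to group scripts that are always used together. A minimal sketch, using the hypothetical page/script names from the example above:

```javascript
// Which pages use which scripts (hypothetical names from the example).
const usage = {
  page1: ['js1', 'js2'],
  page2: ['js1', 'js2'],
  page3: ['js3'],
  page4: ['js1', 'js2', 'js3'],
};

// Group scripts that are used by exactly the same set of pages:
// those can safely be combined into a single file.
function groupBySignature(usage) {
  const pagesByScript = {};
  for (const [page, scripts] of Object.entries(usage)) {
    for (const s of scripts) {
      (pagesByScript[s] = pagesByScript[s] || []).push(page);
    }
  }
  const groups = {};
  for (const [script, pages] of Object.entries(pagesByScript)) {
    const key = pages.sort().join(',');
    (groups[key] = groups[key] || []).push(script);
  }
  return Object.values(groups);
}

// js1 and js2 always appear together, so they become one bundle,
// while js3 stays separate.
console.log(groupBySignature(usage));
```

This is only a sketch of the grouping idea; real bundlers also weigh caching and file sizes, as other answers here point out.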
Conclusion
I am pretty sure there are more details to write about, and not all of the advice is necessary to implement, but it is important to be aware of it. As I mentioned before, I suggest reading this tiny book; it gives you more details. And finally, there is no perfect final solution: you need to start somewhere, do your best, and improve it. Nothing is permanent.
Good luck.
Upvotes: 2
Reputation: 1052
How much you gain from any of these approaches may vary based on your specific needs, but I will describe my case: in my web application we don't combine all files. Instead, we have two types of files: common files, needed globally across the application, and per-page files, used only where they apply. Here is why.
Above is a request-analysis chart for my web application. What you need to consider is this:
The DNS lookup happens only once, since it is cached afterwards (and the DNS name might already be cached anyway).
On each request we have:
request start + initial connection + SSL negotiation + time to first byte + content download
The main factor that takes the majority of the request time in most cases is the content download, so if I have multiple files that are needed on all pages, I combine them into one file to save the per-request overhead. On the other hand, if I have files needed only on specific pages, I keep them separate to save the content download time on the other pages.
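A back-of-the-envelope version of that trade-off, with made-up numbers purely for illustration: each request pays a fixed overhead (connection setup, SSL, time to first byte) on top of the size-proportional download, so combining N always-used files saves (N-1) times the overhead.

```javascript
// Illustrative cost model; both constants are assumptions, not measurements.
const OVERHEAD_MS = 100; // request start + connection + SSL + time to first byte
const MS_PER_KB = 2;     // download cost per kilobyte

function totalTimeMs(fileSizesKB) {
  return fileSizesKB.reduce((t, size) => t + OVERHEAD_MS + size * MS_PER_KB, 0);
}

const splitFiles = [40, 40, 40]; // three files, all needed on every page
const combined = [120];          // the same bytes in one request

// Combining saves two requests' worth of overhead: 540 - 340 = 200 ms.
console.log(totalTimeMs(splitFiles) - totalTimeMs(combined)); // 200
```

The same model shows the other direction: a page-specific 40 KB file that a page doesn't need would cost it 180 ms if blindly merged into the common bundle.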
Upvotes: 3
Reputation: 1136
The answer to your question is: it really depends.
The ultimate goal of page load optimization is to make your users feel that your page load is fast.
Some suggestions:
Do not merge common library JS/CSS files like jQuery, because they might already be cached by the browser from visits to other sites, so you don't even need to download them.
Merge resources, but at least separate the first-screen required resources from the others, because the earlier the user sees something meaningful, the faster your page feels.
If several of your pages share some resources, separate the merged files into shared resources and page-specific resources, so that when the user visits a second page the shared ones may already be cached by the browser and the page loads faster.
Users might be on a phone with a slow or inconsistent 3G/4G connection, where even 50 KB of data or 2 extra requests make a very noticeable difference.
Upvotes: 1
Reputation: 3572
I think @DigitalDan has the best answer.
But the question belies the real one: how do I make my page load faster? Or at least APPEAR to load faster...
I would add something about "above the fold": basically, you want to inline as much as will allow your page to render the main visible content on the first round trip, as that is what is perceived as fastest by the user, and make sure nothing else on the page blocks that...
Archibald explains it well:
https://www.youtube.com/watch?v=EVEiIlJSx_Y
Upvotes: 3
Reputation: 6908
Your question isn't really answerable in a generic way.
There are a few reasons to combine scripts and stylesheets.
Browsers using HTTP/1.1 will open multiple connections, typically 2-4 per host. Because almost every site has the actual HTML file and at least one other resource like a stylesheet, script or image, these connections are created right when you load the initial URL like index.html.
That said, you could split your files across multiple hosts (e.g. an additional static.example.com), which increases the number of hosts/connections and can speed up the download. On the other hand, this brings additional overhead because of the extra connections and additional DNS lookups.
On the other hand, there are valid reasons to leave your files split.
The most important one is HTTP/2. HTTP/2 uses only a single connection and multiplexes all file downloads over that connection. There are multiple demos online that demonstrate this, e.g. http://www.http2demo.io/
If you leave your files split, they can also be cached separately. If just small parts change, the browser can reload only the changed file, and all others are answered with 304 Not Modified. You should have appropriate caching headers in place, of course.
That said, if you have the resources, you could serve all your files separately using HTTP/2 for clients that support it. If you have a lot of older clients, you could fallback to combined files for them when they make requests using HTTP/1.1.
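That fallback idea can be sketched in a few lines. This is only an illustration: the file names are hypothetical, and a real server would serve the files itself; Node's request objects expose the protocol as `httpVersion` ('2.0' for HTTP/2, '1.1' for HTTP/1.1).

```javascript
// Pick which assets to reference based on the client's HTTP version:
// separate files for HTTP/2 (multiplexed over one connection), a single
// combined bundle for HTTP/1.1. File names are hypothetical examples.
function assetListFor(httpVersion) {
  return httpVersion === '2.0'
    ? ['app.js', 'vendor.js', 'widgets.js'] // separate, individually cacheable
    : ['bundle.js'];                        // one combined file, one request
}

console.log(assetListFor('2.0')); // three separate files
console.log(assetListFor('1.1')); // the combined bundle
```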
Upvotes: 8
Reputation: 1794
Tricky question :)
Of course, the trivial answer is that more requests take more time, but it is not necessarily that simple.
So to put it together:
Please ensure that you're not serving static content dynamically. That is, try to load an image in a web browser, open the network traffic view, reload with F5, and check that you get 304 Not Modified from the server instead of 200 OK and real traffic. The biggest performance optimization is not pulling anything from the server at all, and you get it out of the box if caching is used properly :)
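On the server side, that 304 behaviour boils down to comparing the client's cache validator against the resource. A simplified sketch of the Last-Modified variant (real servers also compare ETags and handle malformed dates):

```javascript
// Decide between 304 Not Modified and 200 OK based on the client's
// If-Modified-Since request header. Simplified for illustration.
function statusFor(ifModifiedSince, lastModified) {
  if (ifModifiedSince && new Date(ifModifiedSince) >= new Date(lastModified)) {
    return 304; // client's cached copy is still current: no body sent
  }
  return 200; // send the full resource
}

// Client's copy is newer than the resource's last change: nothing to send.
console.log(statusFor('Wed, 01 Jan 2020 00:00:00 GMT',
                      'Tue, 31 Dec 2019 00:00:00 GMT')); // 304
```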
Upvotes: 5
Reputation: 2637
Multiple requests means more latency, so that will often make a difference. Exactly how costly that is will depend on the size of the response, the performance of the server, where in the world it's hosted, whether it's been cached, etc... To get real measurements you should experiment with your real world examples.
I often use PageSpeed, and generally follow the documented best practices: https://developers.google.com/speed/docs/insights/about.
To try answering your final question directly: additional requests will cost more. It's not necessarily "really bad" to have many files, but it's generally a good idea to combine content into a single file when you can.
Upvotes: 10