Reputation: 1045
So, I have a big menu with a couple of sub-items under each menu item. Each sub-item needs its own JavaScript includes, and fairly often there are around 20 of them. That's obviously bad for performance because of the HTTP request overhead.
My idea is to create a merger script (in PHP) that takes all the JS includes and combines them into one big file. But I have some questions.
There are two approaches I can see.

Case 1: merge the files ahead of time, one merged file per sub-item.
Pros: It's very clear that a certain sub-item has its own JS file.
Cons: A lot of files, and every time you change a JS file you have to update the merged file manually.

Case 2: generate the merged file at runtime (rough sketch below).
Pros: Easy to work with, and every change shows up instantly since the file is built at runtime.
Cons: The processing time spent building the merged file on every request may cancel out the savings from dropping the ~20 HTTP requests.
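A minimal sketch of what the runtime merger in case 2 might look like. The file names and the `$bundles` whitelist are hypothetical examples, not anything from the original setup:

```php
<?php
// merge.php - minimal runtime merger sketch (case 2).
// Whitelist the scripts per sub-item so arbitrary paths can't be requested.
$bundles = [
    'reports' => ['js/jquery.plugin.js', 'js/reports/grid.js', 'js/reports/filters.js'],
];

$bundle = $_GET['bundle'] ?? '';
if (!isset($bundles[$bundle])) {
    http_response_code(404);
    exit;
}

header('Content-Type: application/javascript');

// Concatenate every file in the bundle into one response.
foreach ($bundles[$bundle] as $file) {
    echo "/* --- $file --- */\n";
    readfile($file);
    echo "\n";
}
```

The page would then reference a single script, e.g. `<script src="merge.php?bundle=reports"></script>`, instead of ~20 separate includes.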
So, this is a question about performance. Thoughts?
Upvotes: 0
Views: 659
Reputation: 923
You could also cache the dynamic file. For a given, unique URL you'd only generate the file once, and write it to a physical file. You could then redirect any subsequent request to that physical file.
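A rough sketch of that caching idea, building on the hypothetical `merge.php` bundles from the question (the `cache/` directory and bundle names are assumptions, and the directory has to be writable by PHP):

```php
<?php
// On the first request the merged file is written to disk; later requests are
// redirected to the static copy, which the web server and browser can cache.
$bundle    = preg_replace('/[^a-z0-9_-]/i', '', $_GET['bundle'] ?? '');
$cacheFile = __DIR__ . "/cache/$bundle.js";
$cacheUrl  = "/cache/$bundle.js";

if (!file_exists($cacheFile)) {
    $bundles = [
        'reports' => ['js/jquery.plugin.js', 'js/reports/grid.js', 'js/reports/filters.js'],
    ];
    if (!isset($bundles[$bundle])) {
        http_response_code(404);
        exit;
    }
    $js = '';
    foreach ($bundles[$bundle] as $file) {
        $js .= file_get_contents($file) . "\n";
    }
    file_put_contents($cacheFile, $js);
}

// Redirect any subsequent request to the physical file.
header("Location: $cacheUrl", true, 302);
exit;
```

Invalidation is then just deleting the cached file (or regenerating it) whenever one of the source JS files changes.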
Upvotes: 1