Reputation: 3025
My JavaScript application has to load many micro text files from my web server asynchronously (there are about 200 files of roughly 5 kB each). I know that downloading one large file is much faster than downloading many tiny files, but I cannot predict which files are going to be loaded (the client makes the requests), and I have a large number of files like this.
How can I speed up the transfer of those files?
I thought about concatenating requested files with PHP. Is that a good idea?
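Roughly what I have in mind (just a sketch; combine.php, the files/ directory, and the response format are placeholder names/choices):

    <?php
    // combine.php -- sketch of the idea (all names are placeholders)
    // The client would request e.g. combine.php?files=a.txt,b.txt,c.txt
    $names = explode(',', $_GET['files'] ?? '');

    header('Content-Type: text/plain; charset=utf-8');
    foreach ($names as $name) {
        // basename() keeps a request from escaping the files/ directory
        $path = __DIR__ . '/files/' . basename($name);
        if (!is_file($path)) {
            continue; // skip unknown names
        }
        $data = file_get_contents($path);
        // Prefix each part with its name and byte length so the
        // client can split the combined response back apart.
        echo $name, "\n", strlen($data), "\n", $data, "\n";
    }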
Upvotes: 0
Views: 60
Reputation: 8009
"I thought about concatenating requested files with PHP. Is that a good idea?"
We do the same thing in production with a servlet in Java and it works quite well. But to get it right, we had to cache the concatenated files rather than read the individual files on every request; the file IO has a lot of overhead.
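For example (a minimal sketch, not our production code; it assumes a writable cache/ directory and the same files= query parameter as above):

    <?php
    // Sketch: build the concatenation once, then serve the cached copy.
    $names = explode(',', $_GET['files'] ?? '');
    sort($names); // same set of files -> same cache key
    $cacheFile = __DIR__ . '/cache/' . md5(implode(',', $names)) . '.txt';

    if (!is_file($cacheFile)) {
        // First request for this combination: read and concatenate once.
        $combined = '';
        foreach ($names as $name) {
            $path = __DIR__ . '/files/' . basename($name);
            if (is_file($path)) {
                $combined .= file_get_contents($path);
            }
        }
        file_put_contents($cacheFile, $combined, LOCK_EX);
    }

    header('Content-Type: text/plain; charset=utf-8');
    readfile($cacheFile); // one read instead of ~200

You would also want to invalidate the cache when a source file changes (e.g. compare filemtime()), and an in-memory cache avoids even that single read, which is where a tool like XCache comes in.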
Here's a list of PHP cache tools. From a cursory look at the docs for XCache, you should be able to write a PHP script that collects your individual files, concatenates them, and stores the result in memory to be served as a single resource.
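A rough sketch of that using XCache's variable cache (assumes the XCache extension is loaded; the key prefix and TTL are arbitrary choices):

    <?php
    // Sketch: keep the concatenated result in XCache's in-memory
    // variable cache so repeat requests never touch the filesystem.
    $names = explode(',', $_GET['files'] ?? '');
    sort($names);
    $key = 'combined:' . md5(implode(',', $names));

    if (xcache_isset($key)) {
        $combined = xcache_get($key);
    } else {
        $combined = '';
        foreach ($names as $name) {
            $path = __DIR__ . '/files/' . basename($name);
            if (is_file($path)) {
                $combined .= file_get_contents($path);
            }
        }
        xcache_set($key, $combined, 3600); // keep it for an hour
    }

    header('Content-Type: text/plain; charset=utf-8');
    echo $combined;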
Upvotes: 1