Reputation: 83
I am currently trying to performance-tune the UI of a company web application. The application is only ever going to be accessed by staff, so the connection between the server and the clients will always be considerably faster than it would be over the internet.
I have been using performance auditing tools such as YSlow and Google Chrome's profiling tool to try to highlight areas worth targeting for investigation. However, these tools are written with the internet in mind. For example, the suggestions from a Google Chrome audit of the application currently fall under the following headings:
Network Utilization
Web Page Performance
Are any of these bits of advice totally redundant given the connection speed and usage pattern? The users will be using the application frequently throughout the day, so it doesn't matter if the initial hit is large (when they first visit the page and build their cache) so long as a minimal amount of work is done on future page views.
For example, is it worth the effort of combining all of our CSS and JavaScript files? It may speed up the initial page view, but how much of a difference will it really make on subsequent page views throughout the working day?
I've tried searching for this, but all I keep coming up with is the standard internet-facing performance advice. Any advice on what to focus my performance-tuning efforts on in this scenario, or recommendations for other auditing tools, would be much appreciated.
Upvotes: 1
Views: 1860
Reputation: 92745
Most of these tools provide steps or advice based on a one-time check. That solves a few issues, but it doesn't tell you how your users actually experience your site. Real user monitoring is the right way to measure live user performance. You can use the Navigation Timing API to measure page load times and resource timings.
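For example, here is a minimal sketch of collecting those timings in the browser; the /rum-collect endpoint is just a placeholder for wherever you send the data:

```
// Collect Navigation Timing and Resource Timing data after the page loads.
// The /rum-collect endpoint is a placeholder - post the payload to whatever
// backend or monitoring service you use.
window.addEventListener('load', function () {
  // Wait one tick so loadEventEnd has been filled in.
  setTimeout(function () {
    var nav = performance.getEntriesByType('navigation')[0];
    var resources = performance.getEntriesByType('resource');

    var payload = {
      page: location.pathname,
      // Time from the start of navigation until the load event finished.
      pageLoad: nav ? Math.round(nav.loadEventEnd - nav.startTime) : null,
      // Per-resource durations (CSS, JS, images, Ajax calls...).
      resources: resources.map(function (r) {
        return { name: r.name, duration: Math.round(r.duration) };
      })
    };

    // Fire-and-forget so reporting never blocks the UI.
    navigator.sendBeacon('/rum-collect', JSON.stringify(payload));
  }, 0);
});
```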
If you are looking for a service, you can try https://www.atatus.com/, which provides real user monitoring, Ajax monitoring, transaction monitoring and JavaScript error tracking.
Upvotes: 1
Reputation: 29629
One size does not fit all with these things; the item that immediately jumps out as something that will have a big impact is "leverage browser caching". This reduces bandwidth use, obviously, but it also tells the browser it doesn't need to re-parse whatever you've cached. Even if you have plenty of bandwidth, each file you download consumes resources in the browser - a thread to manage the download, parsing of the file, memory management, and so on. Reducing that work will make the app feel faster.
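You haven't said what your server stack is, but as an illustration, long-lived caching headers for static assets in a Node/Express setup (an assumption on my part) could look something like this:

```
// Illustrative only - assumes a Node/Express server, which the question
// doesn't specify. Fingerprinted CSS/JS/images get cached for a year, so
// repeat page views during the day skip the download and re-parse entirely.
var express = require('express');
var app = express();

app.use('/static', express.static('public', {
  maxAge: '365d',    // sends Cache-Control: max-age=31536000
  immutable: true    // hints that the file at this URL will never change
}));

app.listen(3000);
```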
GZIP compression is possibly redundant, and potentially even harmful if you really do have unlimited bandwidth - it consumes resources on both the server and the client to compress and decompress the data. Not much, and I've never been able to measure a difference - but in theory it might matter.
Proxy caching may also help - depending on your company's network infrastructure.
Reducing cookie size may help - not just because of the bandwidth issue, but again managing cookies consumes resources on the client; this also explains why serving static assets from cookie-less domains helps.
However, if you're going to optimize the performance of the UI, you really need to understand where the slow-down is. YSlow and Chrome focus on common problems, many of them related to bandwidth and the behaviour of the browser. They don't know whether one particular part of your JavaScript is slow, or whether the server is struggling with a particular dynamic page request.
Tools like Firebug help with that - look at what's happening with the network, and whether any assets take longer than you expect. Use the JavaScript profiler to see where you're spending the most time.
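If you just want to bracket a suspect piece of script before reaching for the full profiler, the User Timing API (performance.mark/measure) is handy; renderGrid() below is only a stand-in for whatever code you suspect:

```
// renderGrid() is a stand-in for whatever code you suspect is slow.
function renderGrid() {
  for (var i = 0, s = 0; i < 1e6; i++) { s += i; }  // dummy work
}

// Bracket the suspect code with marks, then measure the gap between them.
performance.mark('grid-start');
renderGrid();
performance.mark('grid-end');
performance.measure('grid-render', 'grid-start', 'grid-end');

// The measure shows up on the profiler timeline and can also be read directly:
var entry = performance.getEntriesByName('grid-render')[0];
console.log('grid render took ' + Math.round(entry.duration) + ' ms');
```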
Upvotes: 3
Reputation: 331
Here is a list of additional services you can use to test website speed: http://sixrevisions.com/tools/free-website-speed-testing/
Upvotes: 0