Reputation: 5687
I'm looking for a guru on caching and AJAX-loaded pages :-)
I have read a lot about the subject but am still unsure what the best way is to cache AJAX pages. I would like to be sure that I'm doing all I can to make my iPhone web app as fast as possible to load and navigate.
This is what I do and have: I'm developing an iPhone web app with jQTouch and PhoneGap. All JS files, CSS files, the index page, and the menu icons are included in the app when it is downloaded from the App Store. The JS and CSS files are minified.
All my subpages are loaded with AJAX from my dedicated server. All subpages are .asp pages that get their content from a MySQL database every time a page is loaded.
Since the iPhone caches pages, I have to delete all AJAX pages once I have visited them, otherwise an update would not be visible. This is not the best way of doing things.
Instead I would like to keep the AJAX pages and use Cache-Control.
This is how I think it should work: turn on Cache-Control on the server (how is this done?). In the app, check the Last-Modified date, and if it has not changed, read from the cache. If it has changed, get the files from the server.
Is this the best way of doing things? Or should I use an ETag instead?
I would like to know how to set up a Windows Server 2008 machine with IIS 7 with the right Cache-Control, how to write the correct headers in the index files, and whether I need to write some header-reading code in my ASP AJAX pages.
I hope somebody knows how to do this. Any input appreciated, thanks!
Upvotes: 1
Views: 1270
Reputation: 516
I have just found additional info about Safari's cache interaction with web apps.
Here is the link
It seems that the content you want to cache needs to be specified in the manifest file.
Additionally, the link explains how to trigger cache invalidation and content refresh via JavaScript.
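To make the manifest idea concrete, here is a sketch of what an HTML5 application cache manifest could look like for an app like yours. The file names are purely illustrative (taken from the setup you described: minified JS/CSS and menu icons shipped with the app); the key points are that the file must start with the line `CACHE MANIFEST`, that changing anything in it (even a version comment) triggers a full re-download of the cached resources, and that resources listed under `NETWORK:` are always fetched from the server:

```
CACHE MANIFEST
# v1 -- bump this comment to invalidate the cache and force a re-download

index.html
css/app.min.css
js/app.min.js
icons/menu.png

NETWORK:
# everything not listed above (e.g. the AJAX subpages) is always
# fetched from the server
*
```

The manifest is referenced from the page as `<html manifest="app.manifest">`, and from JavaScript you can check `window.applicationCache` to react to update events.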
Upvotes: 1
Reputation: 516
It sounds strange to me that the iPhone caches the pages locally and that you have to clean up the cache to get fresh content.
In fact, there must be some HTTP cache headers which tell the iPhone browser to do that. It should not matter whether the browser is Safari on iOS, Opera on Android, or Firefox on Windows, because all modern browsers "should" conform to the HTTP 1.1 RFC, even the mobile ones. Surely they might have some differences to save bandwidth, phone battery, and CPU cycles, but they should definitely allow users to get fresh content from a website. So I agree with you: cleaning the visited pages is not the right way of doing things, and I definitely think there is a problem somewhere. Try calling the API from a desktop browser and inspect the requests via Firebug, or Chrome's native debugging tools, to check the server response headers and see if the behavior is the same.
Normally I use a mix of max-age and If-Modified-Since cache validators. I set the max-age of some of the images to a high value (in my case 365 days, most probably not suitable for you), for some others to a week, and to a day for CSS and JavaScript. Additionally, I use If-Modified-Since so I get a good balance between content freshness, server load, and bandwidth usage. The ETag is basically the same thing as If-Modified-Since, with the difference that you can set it as a weak cache validator, and it is really useful if the clock of the server is not reliable. (See the extended answer below for more info and references to the docs.)
To enable the cache, IIS 7 makes it really easy for you with the Output Caching IIS module. Here is a good doc about it, covering how to enable it and the difference between kernel-mode and user-mode caching.
Unless you need to do something really particular, I think you will not need to write code for it, and IIS will take care of everything. If that's not the case, then here is another good guide which will help you out.
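As a rough illustration (the extensions and durations below are made-up values you would tune yourself), the relevant pieces of an IIS 7 `web.config` could look like this: an output-caching profile for the .asp pages, plus a `clientCache` element that sets the `Cache-Control: max-age` header on static content:

```xml
<configuration>
  <system.webServer>
    <!-- Output caching: cache rendered .asp responses for 60 seconds -->
    <caching enabled="true" enableKernelCache="true">
      <profiles>
        <add extension=".asp" policy="CacheForTimePeriod" duration="00:01:00" />
      </profiles>
    </caching>
    <!-- Client cache: static files (images, CSS, JS) get max-age = 7 days -->
    <staticContent>
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>
```

With `clientCache` in place you do not need per-file header code for static content; for dynamic ASP pages you can still override headers in code when needed.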
If you really want a fast website, then I would recommend enabling compression and adding a cache layer between the application and the database, so you will not execute queries against the DB on every request.
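The "cache layer between the application and the database" idea can be sketched in a few lines. This is a hypothetical in-memory TTL cache (the function names and the fake `run_query` stand-in are mine, not from any real library) just to show the shape of it; in production you would typically use something like memcached:

```python
import time

_cache = {}  # query string -> (expires_at, rows)

def run_query(query):
    # Stand-in for the real MySQL call; returns fake rows here.
    return [("row for: %s" % query,)]

def cached_query(query, ttl=60, now=None):
    """Return cached rows if they are younger than ttl seconds,
    otherwise hit the database and refresh the cache entry."""
    now = time.time() if now is None else now
    hit = _cache.get(query)
    if hit is not None and hit[0] > now:
        return hit[1]                      # fresh cache hit: no DB round-trip
    rows = run_query(query)                # miss or expired: go to the DB
    _cache[query] = (now + ttl, rows)
    return rows

first = cached_query("SELECT * FROM pages", ttl=60, now=0)    # goes to the DB
again = cached_query("SELECT * FROM pages", ttl=60, now=30)   # served from cache
```

The trade-off is the same as with HTTP caching: a longer TTL means less DB load but staler content.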
Here are some more good performance guidelines for any web developer.
Extended Answer (maybe even too extended :) )
There are different ways of caching content on the client side via HTTP 1.1. The most common ones, or at least the ones I have used the most, are:
1) The max-age directive is sent from the server in the Cache-Control HTTP response header, and basically tells the client's browser to store the content in its cache for a period of time specified in seconds.
For instance, let's assume a server gives back max-age=60 in response to a client's GET request for logo.jpg. The client's browser then stores logo.jpg, and for the next 60 seconds it will serve the image from its own cache. In other words, there will be no HTTP requests for this specific image for the next 60 seconds. So with max-age the content is cached on the client side, and will not be requested or revalidated with the server for the number of seconds specified in the max-age directive. There is, however, normally the possibility to force this revalidation/refresh by pressing CTRL-F5 in Windows browsers and CMD-R in Mac browsers. On mobile devices the functionality is normally in the browser menu and is called refresh. This is the appropriate section of the RFC.
PROS: while the content is fresh, no requests at all are sent for it, so this gives the biggest savings in bandwidth and latency.
CONS: if the content changes on the server before max-age expires, clients keep serving the stale copy from their cache.
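The freshness rule above boils down to a single comparison. Here is a minimal sketch (the function name is mine, purely illustrative) of the check a browser cache performs for `Cache-Control: max-age`:

```python
def is_fresh(stored_at, max_age, now):
    """True while the cached response's age (seconds elapsed since it
    was stored) is below max-age, i.e. no request needs to be sent."""
    return (now - stored_at) < max_age

# logo.jpg stored at t=0 with max-age=60:
fresh_at_30 = is_fresh(stored_at=0, max_age=60, now=30)  # served from cache
fresh_at_61 = is_fresh(stored_at=0, max_age=60, now=61)  # expired: ask the server again
```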
2) The Last-Modified server-side HTTP response header, together with the client-side If-Modified-Since request header, is another good mechanism for speeding up sites and saving some money. It basically works in this way:
A browser requests content from a server for the first time via a GET request. The server responds with a 200 and gives back the content together with a Last-Modified header. The Last-Modified header's value is nothing other than the actual date and time when the content was last modified. (The date and time must be in UTC because of timezones.) From this point on, all the following HTTP requests for the same content coming from the client will have an additional header called If-Modified-Since, with the date and time received from the server as its value. When the server receives these requests, it will check the If-Modified-Since header value and compare it with the last-modified date of the content. If the date is the same, and therefore the content has not changed, the server will respond with a 304 and basically no content (most important part!). The browser then knows it can keep the content in its cache and load it from there, because it has not changed on the server side. The process continues until the content on the server changes, at which point the server will provide a new Last-Modified date and fresh content. As you can see, this can save a lot of bandwidth, especially in the case of images, JS, or CSS, without giving up content freshness. Section 14.25 of the spec explains things much better than I did. :)
PROS: content freshness is guaranteed, and the 304 responses carry no body, so a lot of bandwidth is saved.
CONS: unlike max-age, the browser still makes one HTTP round-trip per resource to revalidate it.
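The server side of that handshake can be sketched like this. This is not the asker's actual ASP code, just an illustrative model (the handler name, date, and content are made up; a real server would also parse and compare the dates rather than compare strings):

```python
LAST_MODIFIED = "Sat, 01 Oct 2011 10:00:00 GMT"  # illustrative UTC timestamp
CONTENT = b"<div>subpage markup</div>"

def handle_get(request_headers):
    """Return (status, headers, body) for a conditional GET."""
    if request_headers.get("If-Modified-Since") == LAST_MODIFIED:
        # Content unchanged: 304 with no body; the client keeps its cached copy.
        return 304, {"Last-Modified": LAST_MODIFIED}, b""
    # First visit, or content changed: full 200 response with the body.
    return 200, {"Last-Modified": LAST_MODIFIED}, CONTENT

# First request carries no validator -> 200 with the content.
status, headers, body = handle_get({})
# Revalidation sends the date back -> 304 with an empty body.
status2, _, body2 = handle_get({"If-Modified-Since": headers["Last-Modified"]})
```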
3) The ETag is a similar process to If-Modified-Since, with the difference that the value of the header is normally a hash of the server-side content, and the client sends it back on its HTTP requests in a header called If-None-Match.
The pros and cons are the same as for point 2.
You might now wonder what the main difference is between point 2 and point 3. The problem with point 2 is actually the server clock: there could be problems serving back the Last-Modified date from the server if its clock is unreliable.
The subject goes quite a bit deeper, because the best practice is to send both a weak and a strong validator. See section 13.3.4 for more information.
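For completeness, here is the ETag variant of the earlier conditional-GET sketch (again illustrative, not production code): the validator is a hash of the content instead of a timestamp, so it does not depend on the server clock at all. MD5 is used here only because it is a convenient stdlib hash for demonstration:

```python
import hashlib

CONTENT = b"body { color: #333; }"

def make_etag(content):
    # A strong validator: changes whenever a single byte of the content changes.
    return '"%s"' % hashlib.md5(content).hexdigest()

def handle_get(request_headers):
    """Return (status, headers, body) for a conditional GET using ETag."""
    etag = make_etag(CONTENT)
    if request_headers.get("If-None-Match") == etag:
        return 304, {"ETag": etag}, b""   # unchanged: empty 304
    return 200, {"ETag": etag}, CONTENT   # first visit or changed content

status, headers, body = handle_get({})
status2, _, body2 = handle_get({"If-None-Match": headers["ETag"]})
```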
Upvotes: 2