Tom

Reputation: 7091

Ajax "caching", the good, the bad, the indifferent?

So I don't actually mean browser caching of an Ajax request using the GET method, but storing large query responses (any number, likely double digits, of 40 - 300kb responses) in the browser's memory.

What are the unseen benefits and risks associated with this?

var response = JSON.parse(xhr.responseText);
Cache[query] = response; // Store the parsed JSON object in the global object `Cache`, keyed by query
// Time passes, stuff is done ...
if (Cache[query])
    load(Cache[query]);
else
    Ajax(query, cache_results);

Upvotes: 0

Views: 264

Answers (3)

Annie

Reputation: 6631

You'll probably want to stress-test the memory usage and general performance of various browsers when storing many 300kb strings. You can monitor them in task manager, and also use performance tools like Speed Tracer and dynatrace ajax edition.
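
As a rough way to simulate that load, you could fill an in-memory cache with dummy strings of the target size and watch the browser's footprint grow in the task manager. A sketch only; the entry count and size are assumptions taken from the question's numbers:

// Rough stress test: hold COUNT strings of ~300kb each in memory,
// then watch the browser's memory usage in the task manager.
var cache = [];
var SIZE = 300 * 1024; // ~300kb per entry (the question's upper bound)
var COUNT = 50;        // "double-digits" worth of cached responses
var chunk = new Array(SIZE + 1).join('x'); // one string of SIZE characters
for (var i = 0; i < COUNT; i++) {
    cache.push(chunk + i); // concatenation forces a distinct string per entry
}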

If it turns out that caching is a performance win but it starts to get bogged down when you have too many strings in memory, you might try HTML5 storage or Flash storage for the strings; that way you can cache things across sessions as well. Dojo storage is a good library for this.
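
If you go that route, a minimal sketch against the standard HTML5 localStorage API (which Dojo storage wraps, along with Flash and other fallbacks) might look like this; the function names and the 'cache:' key prefix are just illustrative:

// Persist a response string across sessions with HTML5 localStorage.
// Note: localStorage holds strings only, and most browsers cap it
// at around 5 MB, so a large cache may still need eviction logic.
function cacheResponse(query, responseText) {
    try {
        localStorage.setItem('cache:' + query, responseText);
    } catch (e) {
        // Quota exceeded: keep the entry in memory only.
    }
}

function getCachedResponse(query) {
    var text = localStorage.getItem('cache:' + query);
    return text ? JSON.parse(text) : null;
}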

Upvotes: 0

mkrause

Reputation: 1399

Is there an actual need, or is it just optimization for its own sake? I'd suggest doing some profiling first to see where the bottlenecks lie. Remember that a web page session typically doesn't last very long, so unless you're using some kind of offline storage, the cache won't survive for long either.
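
A cheap first measurement, using the question's own placeholder functions (console.time is available in Firebug and the WebKit developer tools), would be to time a cache hit against a network round-trip:

// Compare the cost of serving from the in-memory cache...
console.time('cache-hit');
load(Cache[query]);
console.timeEnd('cache-hit');

// ...against the cost of a fresh Ajax round-trip.
console.time('network');
Ajax(query, function (response) {
    console.timeEnd('network');
    cache_results(response);
});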

Upvotes: 3

jldupont

Reputation: 96716

Without a full view of your system it's hard to tell, but I would think that working with potentially stale data will be a concern.

Of course, if you have a protocol in place to resolve "cache freshness" you are on the right track... but then, why not rely on the HTTP protocol to do this? (HTTP GET with ETag/Last-Modified headers)
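
A sketch of how the two can be combined: keep the in-memory Cache from the question, but revalidate entries with a conditional GET, so the server can answer 304 Not Modified instead of resending the full body. The etags map and the function name here are assumptions for illustration:

// Revalidate a cached entry with a conditional GET.
// `etags` remembers the ETag the server sent for each query URL.
var etags = {};

function fetchWithValidation(url, callback) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url, true);
    if (etags[url]) {
        xhr.setRequestHeader('If-None-Match', etags[url]);
    }
    xhr.onreadystatechange = function () {
        if (xhr.readyState !== 4) return;
        if (xhr.status === 304) {
            callback(Cache[url]); // server confirms our copy is still fresh
        } else if (xhr.status === 200) {
            etags[url] = xhr.getResponseHeader('ETag');
            Cache[url] = JSON.parse(xhr.responseText);
            callback(Cache[url]);
        }
    };
    xhr.send(null);
}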

Upvotes: 0
