Reputation: 1925
How would you cache an ActiveResource model, preferably in memcached? Right now it pulls the model from my REST API fine, but it fetches dozens of records each time, so it would be best to cache them.
Upvotes: 3
Views: 2541
Reputation: 793
I would suggest looking into https://github.com/Ahsizara/cached_resource; almost all of the work is done for you by the gem.
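As a rough sketch of what that might look like (the cached_resource class method and its :ttl option are taken from the gem's README; the class, site URL, and TTL value here are just placeholders):

# Gemfile
gem 'cached_resource'

# app/models/person.rb
class Person < ActiveResource::Base
  self.site = "https://api.example.com"

  # Caches find results through Rails.cache (memcached, if that's your cache store).
  cached_resource :ttl => 3600
end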
Upvotes: 1
Reputation: 108
I've been playing around with the same thing, and I think I've found a pretty simple way to check Redis for the cached object first. This only works when you use the find method, but for my needs that's sufficient.
By overriding find, I can compute a checksum of the arguments and see whether I already have the response saved in Redis. If I do, I can pull the JSON response out of Redis and build a new object right there. If I don't, the call passes through to ActiveResource::Base's find and the normal request happens.
I haven't implemented saving the responses into Redis from ActiveResource yet; my plan is to populate those caches elsewhere. That way I can normally rely on the caches being there, and fall back to the API when they aren't.
class MyResource < ActiveResource::Base
  class << self
    def find(*arguments)
      # Key the cache on an MD5 of the find arguments (see the md5key patch below).
      checksum = Digest::MD5.hexdigest(arguments.md5key)
      cached = $redis.get "cache:#{self.element_name}:#{checksum}"
      if cached
        # Cache hit: rebuild the object straight from the stored JSON.
        return self.new JSON.parse(cached)
      end
      # Cache miss: fall through to ActiveResource::Base's find as usual.
      scope = arguments.slice!(0)
      options = arguments.slice!(0) || {}
      super scope, options
    end
  end
end
and a little patch so we can get an md5key for our array:
require 'digest/md5'

class Object
  def md5key
    to_s
  end
end

class Array
  def md5key
    map(&:md5key).join
  end
end

class Hash
  def md5key
    # Sort so hashes with the same pairs in a different order produce the same key.
    sort.map(&:md5key).join
  end
end
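As for populating those caches elsewhere, here is a rough sketch of what a warming step could look like (the warm_cache helper name and the 30-minute TTL are assumptions, not part of the code above; the only requirement is that it writes the same cache:<element_name>:<checksum> key the find override reads):

# Hypothetical warm-up helper: fetches a record through the API and stores its
# attributes as JSON under the same key scheme used by the find override above.
def warm_cache(id)
  checksum = Digest::MD5.hexdigest([id].md5key)
  key = "cache:#{MyResource.element_name}:#{checksum}"
  record = MyResource.find(id)               # goes to the API while the key is still cold
  $redis.set key, record.attributes.to_json  # store the raw attributes for later rehydration
  $redis.expire key, 30 * 60                 # optional TTL so stale records age out
end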
Does that help?
Upvotes: 4
Reputation: 3307
Caching in Rails is configurable; you can configure the cache to be backed by memcached. Typically you cache when you retrieve. It's unclear whether you are a REST consumer or a service, but that's not really relevant: if you cache on read (or retrieve) and then read from the cache the next time, everything will work just fine. If you are pulling the data from a database, serve up the cache, and if no cache is available, cache the read from the database.
I wrote a blog post about it here: http://squarism.com/2011/08/30/memcached-with-rails-3/
However, what I wrote about is really pretty simple: it just shows how to avoid an expensive operation with something similar in spirit to the ||= operator. For a better example, New Relic has a Scaling Rails episode that shows how to cache the latest 10 posts:
def self.recent
  Rails.cache.fetch("recent_posts", :expires_in => 30.minutes) do
    self.find(:all, :limit => 10)
  end
end
Rails.cache has been configured to use memcached here; that's the configurable part I was talking about.
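For reference, a minimal sketch of that configuration in Rails 3 (the application class name, host, and port are placeholders for your own setup):

# config/environments/production.rb (or whichever environment should cache)
YourApp::Application.configure do
  # Back Rails.cache with memcached.
  config.cache_store = :mem_cache_store, "localhost:11211"
end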
Upvotes: 1