Reputation: 4636
I'm aware that App Engine has memcache built in. I'm using Objectify, so all I have to do is add an annotation to use it.
I see one downside of the memcache solution: Google has control over it. If they want to free up some memory on the server to supply another instance, they can empty my cache and I have to pay to refill it.
For this reason I'm wondering about creating my own cache as a HashMap of Objectify entity instances, but I have two concerns that I'm struggling to find information on.
How can I monitor the memory usage of an instance to make sure I don't get too close to the 128 MB limit?
I understand that Objectify caches properties rather than whole entities in memcache. Is there a technical reason why I can't cache instances of Objectify entities?
Upvotes: 0
Views: 157
Reputation: 2725
Note that there's also dedicated memcache, which might be an easier solution.
But to make sure you don't go over the memory limit, you can probably use something like Ehcache and configure a size limit on the cache.
As to your second question: as far as I know, Objectify behaves like that to handle serialization and deserialization more gracefully when you deploy a new version and your classes have changed slightly. When doing your own caching on your own instances you don't have that problem. However, evicting stale items across multiple instances is always tricky, so be careful with that.
Upvotes: 1
Reputation: 230
I am assuming this is for Google App Engine - Python.
Yes. We are using cachepy in production currently, with decent success.
https://code.google.com/p/cachepy/
Note that this is an instance-specific cache mechanism.
So you could use cachepy as a per-instance cache, with memcache as a fallback layer in front of the datastore.
When a lookup misses in cachepy you check memcache, and when it misses there as well you fall back to fetching it from the datastore.
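A minimal sketch of that read-through chain (assuming cachepy exposes module-level get/set as in the linked source; fetch_from_datastore is a hypothetical stand-in for your own entity lookup):

```python
import cachepy  # per-instance cache: https://code.google.com/p/cachepy/

from google.appengine.api import memcache

def get_entity(key):
    # 1. Per-instance cache: fastest, but local to this instance only.
    entity = cachepy.get(key)
    if entity is not None:
        return entity

    # 2. Shared memcache: visible to all instances, but can be evicted.
    entity = memcache.get(key)
    if entity is None:
        # 3. Last resort: read from the datastore and repopulate memcache.
        entity = fetch_from_datastore(key)  # hypothetical helper
        memcache.set(key, entity)

    # Repopulate the per-instance cache on the way out.
    cachepy.set(key, entity)
    return entity
```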
For monitoring usage, you could do something like this (although I am not sure it will work as I intend on GAE): in the set function of cachepy, https://code.google.com/p/cachepy/source/browse/cachepy.py#65, you could add a check on the size of the cache using sys.getsizeof and raise a MemoryError or something like that.
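As a rough sketch of that check (names like _cache are illustrative, not cachepy's actual internals; note that sys.getsizeof is shallow and only gives a lower bound on real memory use):

```python
import sys

_cache = {}  # stand-in for cachepy's internal store
_CACHE_LIMIT_BYTES = 32 * 1024 * 1024  # stay well below the 128 MB instance limit

def set(key, value, expiry=0):
    # sys.getsizeof does not follow references, so this is only a coarse guard.
    if sys.getsizeof(_cache) + sys.getsizeof(value) > _CACHE_LIMIT_BYTES:
        raise MemoryError('per-instance cache limit reached')
    _cache[key] = (value, expiry)
```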
Upvotes: 1