Reputation: 3347
I have an application which does a lot of work with large in-memory data lists. Currently I use a singleton data cache structured as follows:
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Map;

public class DataCache {
    private final Map<String, Map<String, ArrayList<String>>> arraysMap = new HashMap<>();
    private final Map<String, Map<String, Integer>> integerMap = new HashMap<>();
    private final Map<String, Map<String, String>> stringMap = new HashMap<>();
    private final Map<String, Map<String, MyObject>> myObjectMap = new HashMap<>();

    /**
     * Map of boolean locks used to temporarily disable
     * editing while the data changes.
     */
    private final Map<String, Map<String, Boolean>> locksMap = new HashMap<>();

    private DataCache() {
    }

    private static final DataCache dataCache = new DataCache();

    public static DataCache getInstanceOf() {
        return dataCache;
    }

    ...getters/setters...
}
The arraysMap and myObjectMap get the majority of the workload - around 20-50 list operations per second (put, replace, delete and similar) on arraysMap, and a similar rate on myObjectMap (summed over all objects). The maps are accessed in a multi-threaded context. This kind of load wasn't expected initially, but it seems to be working just fine. However, it looks like the data complexity will grow, as will the load.

The question is: is it reasonable to switch to some external solution (Redis?), or might keeping this improvisation be sufficient? The load could increase to at most 500 operations per second per map, and lists in arraysMap can be as big as 200 MB.
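For reference, here is a minimal sketch of how the same structure might be made thread-safe with ConcurrentHashMap instead of plain HashMap (the accessor names putList/getList are illustrative, not from my real code):

import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ConcurrentDataCache {
    // Nested concurrent maps avoid locking the whole cache on every access.
    private final Map<String, Map<String, List<String>>> arraysMap =
            new ConcurrentHashMap<>();

    private ConcurrentDataCache() {
    }

    private static final ConcurrentDataCache instance = new ConcurrentDataCache();

    public static ConcurrentDataCache getInstanceOf() {
        return instance;
    }

    // computeIfAbsent creates the inner map atomically on first use,
    // so two threads hitting the same outer key at once don't race.
    public void putList(String outerKey, String innerKey, List<String> list) {
        arraysMap.computeIfAbsent(outerKey, k -> new ConcurrentHashMap<>())
                 .put(innerKey, list);
    }

    public List<String> getList(String outerKey, String innerKey) {
        Map<String, List<String>> inner = arraysMap.get(outerKey);
        return inner == null ? null : inner.get(innerKey);
    }
}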
Upvotes: 0
Views: 2037
Reputation: 7077
It is reasonable to switch to an external cache solution such as Redis or Memcached. The most important benefit is that you can scale out your application easily when the workload grows: you only need to deploy your application on additional machines.
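As a rough illustration, moving one of your nested maps to Redis might look like this with the Jedis client (a minimal sketch; it assumes a Redis server on localhost:6379, and the stringMap:users key and sample values are made up for the example):

import redis.clients.jedis.Jedis;

public class RedisStringCache {
    // One Redis hash per outer key replaces a nested Map<String, String>.
    public static void main(String[] args) {
        try (Jedis jedis = new Jedis("localhost", 6379)) {
            // put: stringMap.get("users").put("name", "alice")
            jedis.hset("stringMap:users", "name", "alice");

            // get: stringMap.get("users").get("name")
            String name = jedis.hget("stringMap:users", "name");
            System.out.println(name);

            // delete: stringMap.get("users").remove("name")
            jedis.hdel("stringMap:users", "name");
        }
    }
}

Because the data then lives outside the JVM, every instance of the application sees the same cache, so adding machines requires no coherence logic in the application itself.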
Upvotes: 1