Reputation: 19177
I guess I'm another person trying to make some kind of a cache with WeakHashMap. And I need some help with it.
I have a bunch of TrackData objects that contain information about audio tracks. Then there are Track objects that keep a reference to a TrackData inside. Several tracks can point to the same TrackData. Then I have a TrackDataCache class that looks like this:
public class TrackDataCache {
    private static TrackDataCache instance = new TrackDataCache();

    public static TrackDataCache getInstance() {
        return instance;
    }

    private WeakHashMap<TrackData, WeakReference<TrackData>> cache =
            new WeakHashMap<TrackData, WeakReference<TrackData>>();

    public void cache(Track track) {
        TrackData key = track.getTrackData();
        WeakReference<TrackData> trackData = cache.get(key);
        if (trackData == null) {
            cache.put(key, new WeakReference<TrackData>(key));
        } else {
            track.setTrackData(trackData.get());
        }
    }
}
So when I load a track, I call TrackDataCache.cache(), and if its track data was not loaded before, it is cached; otherwise it is replaced with the cached copy (TrackData overrides equals() to compare the location and subsong index). I want to use weak references so that I don't need to care when I remove Tracks.

I wanted to ask whether it is OK practice to keep a weak reference to the key in a WeakHashMap, and if not, how I should approach this problem. I need weak references and constant-time retrieval of cached values. I was thinking of copying the WeakHashMap code and making its getEntry() method public, which would solve the problem, but it's such a bad hack :(

PS. I understand that either Apache or Google collections may have something like this, but I really don't want to add 2 MB of dependencies.
Upvotes: 3
Views: 1307
Reputation: 66166
I'd recommend replacing the WeakReferences with SoftReferences.

Any object which is referenced only by a WeakReference is a target for every round of the garbage collector. It means that your cache can be cleared even if there's still enough free memory.

If you replace WeakReference with SoftReference, then you state: remove the referenced object only when there's absolutely no free memory left to allocate.
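To illustrate the difference, here is a minimal stdlib-only sketch (not the poster's code; the class and method names are made up for this example) of a cache that holds its values through SoftReferences, so entries survive ordinary GC cycles but can be reclaimed and rebuilt under memory pressure:

```java
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Values are held via SoftReference: the GC clears them only when
// memory runs low; a cleared entry is transparently rebuilt.
class SoftCache<K, V> {
    private final Map<K, SoftReference<V>> map = new HashMap<K, SoftReference<V>>();

    public V get(K key, Function<K, V> loader) {
        SoftReference<V> ref = map.get(key);
        V value = (ref == null) ? null : ref.get();
        if (value == null) {                        // missing or reclaimed
            value = loader.apply(key);
            map.put(key, new SoftReference<V>(value));
        }
        return value;
    }
}
```

Under normal heap conditions repeated get() calls return the same instance; only after the collector has reclaimed a softly-reachable value does the loader run again.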
There's no ready-to-use SoftHashMap implementation in Java. There is a good example in Guava - MapMaker. It's worth using this well-tested code, verified in production environments, rather than providing your own, definitely lower-quality implementation. It also has an amazing 'self-cleaning' mechanism:

    As the map size grows close to the maximum, the map will evict entries that are less likely to be used again. For example, the map may evict an entry because it hasn't been used recently or very often.

See the expireAfterWrite and expireAfterAccess methods.

I also find your cache design not very convenient. As I understand from your code snippet, at the start your Tracks have strong references to their TrackData, and you build your cache on those circumstances. But from some moment on you want to use your cache for retrieving data, so you'll have to create new Tracks in some other way, because from that moment you want to use the cache and not strong references.
Different Tracks can have the same TrackData, so we can't use Track as a key. So, I'd go with the next approach:

- Create a Map<Integer, TrackData> with soft values and a defined self-cleaning strategy (based on MapMaker);
- Change Track --> TrackData to Track --> Id (int). Cache Id --> TrackData.
Upvotes: 2
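A stdlib-only sketch of that indirection, using SoftReference values directly (Guava's MapMaker would add size limits and expiry on top; the TrackData stand-in and all names here are assumptions for illustration): each Track keeps only an int id, and a shared registry resolves the id to softly-held TrackData.

```java
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;

// Hypothetical minimal stand-in for the poster's TrackData.
class TrackData {
    final String location;
    TrackData(String location) { this.location = location; }
}

// Track --> id (int), id --> TrackData. Values are softly referenced,
// so the GC may drop them under memory pressure.
class TrackDataRegistry {
    private final Map<Integer, SoftReference<TrackData>> byId =
            new HashMap<Integer, SoftReference<TrackData>>();
    private int nextId = 0;

    // Register freshly loaded data; the Track stores the returned id.
    public synchronized int register(TrackData data) {
        int id = nextId++;
        byId.put(id, new SoftReference<TrackData>(data));
        return id;
    }

    // Resolve an id; null means the entry was reclaimed and must be reloaded.
    public synchronized TrackData lookup(int id) {
        SoftReference<TrackData> ref = byId.get(id);
        return (ref == null) ? null : ref.get();
    }
}
```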
Reputation: 13620
TrackData can be shared by many instances of Track. We need a key system that doesn't require a TrackData in order to obtain the same shared instance for several Tracks.
public class Track {
    @Override
    public int hashCode() {
        // ... make a hash code that is the same for
        // ... tracks sharing the same track data.
    }

    @Override
    public boolean equals(Object other) {
        // ... ensure that if A.equals(B) then A.hashCode() == B.hashCode()
    }
}
public class TrackDataManager {
    private WeakHashMap<Track, TrackData> cache = new WeakHashMap<Track, TrackData>();

    public TrackData getTrackData(Track track) {
        // Track.hashCode()/equals() ensures two tracks that
        // share track data will get the same object back.
        TrackData data = cache.get(track);
        if (data == null) {
            data = constructDataFromTrackFile(track);
            cache.put(track, data);
        }
        return data;
    }

    private TrackData constructDataFromTrackFile(Track track) {
        // ... read data from the file and create that object.
    }
}
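To make that concrete, here is a runnable sketch under assumed, simplified classes: equality of this hypothetical Track is based on its file path, and constructDataFromTrackFile is a stand-in for real metadata parsing.

```java
import java.util.WeakHashMap;

// Hypothetical minimal Track: equality based on the file path, so two
// Track instances for the same file resolve to one shared TrackData.
class Track {
    final String path;
    Track(String path) { this.path = path; }
    @Override public int hashCode() { return path.hashCode(); }
    @Override public boolean equals(Object o) {
        return o instanceof Track && path.equals(((Track) o).path);
    }
}

class TrackData {
    final String title;
    TrackData(String title) { this.title = title; }
}

class TrackDataManager {
    private final WeakHashMap<Track, TrackData> cache = new WeakHashMap<Track, TrackData>();

    TrackData getTrackData(Track track) {
        TrackData data = cache.get(track);
        if (data == null) {
            data = constructDataFromTrackFile(track);
            cache.put(track, data);
        }
        return data;
    }

    // Stand-in for actually parsing the audio file's metadata.
    private TrackData constructDataFromTrackFile(Track track) {
        return new TrackData("title of " + track.path);
    }
}
```

Two equal Tracks created independently then receive the same TrackData object from the manager.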
If the construction of the TrackData object is always going to happen as part of reading the file, but the created instance is being thrown away in favour of the shared instance, I'd model that like this:
public class TrackData {
    @Override
    public int hashCode() {
        // ... make a hash code that is the same for the same track data.
    }

    @Override
    public boolean equals(Object other) {
        // ... ensure that if A.equals(B) then A.hashCode() == B.hashCode()
    }
}
public class TrackDataCache {
    private WeakHashMap<Integer, TrackData> cache = new WeakHashMap<Integer, TrackData>();

    public TrackData getTrackData(Track track) {
        // The cache contains shared TrackData instances; we may throw away
        // the Track's own instance in favour of the shared one.
        Integer key = track.getTrackData().hashCode();
        TrackData data = cache.get(key);
        if (data == null) {
            data = track.getTrackData();
            cache.put(key, data);
        } else {
            // Ensure we're using the shared instance, not the local one
            // (deliberate object reference comparison).
            if (data != track.getTrackData()) {
                track.setTrackData(data);
            }
        }
        return data;
    }
}
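A runnable sketch of this second approach, with hypothetical minimal Track/TrackData classes standing in for the poster's real ones (which have more fields than a location and subsong index):

```java
import java.util.WeakHashMap;

// Hypothetical stand-in: equality based on location and subsong index,
// as the question describes.
class TrackData {
    final String location;
    final int subsong;
    TrackData(String location, int subsong) {
        this.location = location;
        this.subsong = subsong;
    }
    @Override public int hashCode() { return location.hashCode() * 31 + subsong; }
    @Override public boolean equals(Object o) {
        if (!(o instanceof TrackData)) return false;
        TrackData t = (TrackData) o;
        return subsong == t.subsong && location.equals(t.location);
    }
}

class Track {
    private TrackData data;
    Track(TrackData data) { this.data = data; }
    TrackData getTrackData() { return data; }
    void setTrackData(TrackData data) { this.data = data; }
}

class TrackDataCache {
    private final WeakHashMap<Integer, TrackData> cache = new WeakHashMap<Integer, TrackData>();

    TrackData getTrackData(Track track) {
        Integer key = track.getTrackData().hashCode();
        TrackData data = cache.get(key);
        if (data == null) {
            data = track.getTrackData();
            cache.put(key, data);
        } else if (data != track.getTrackData()) {
            track.setTrackData(data); // adopt the shared instance
        }
        return data;
    }
}
```

After both tracks pass through the cache, two Tracks loaded independently with equal TrackData end up referencing the same shared object.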
Notice that the WeakHashMap will not evict anything in either of the two solutions as long as there are live Track objects keeping references to the TrackData. This could be fixed by holding a WeakReference inside Track - however, that also means you can end up not having any TrackData and needing to read it back from the file, in which case the first solution is a better model than the second.
Upvotes: 1