Reputation: 611
I'm running a complex computation (a Markov chain model) and use
open System
open System.Collections.Concurrent

let memoize f =
    let cache = new ConcurrentDictionary<'key,'value>()
    cache, fun x -> cache.GetOrAdd(x, Func<'key, 'value> f)
to cache intermediate results of multiple functions. The overall structure is something like this:
module Foo =
    [...]
    let _, foo' = memoize foo

module Bar =
    [...]
    let _, bar' = memoize bar

module Main =
    open Foo
    open Bar
    [...]
    let result =
        foobar (foo' a) (bar' b)
Typically I run this once and then the program terminates, but it's obviously not nice to leave those cache dictionaries behind without cleaning them up. Also, I sometimes need to call the model for many different inputs, and then I quickly run into memory issues. What's the best way to clean up multiple caches at once?
Edit
As mentioned in the comments, it would of course be possible to collect all caches into a list, but I'd have to box the dictionaries, and that doesn't seem nice to me. Is there a better (overall) strategy?
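To make that concrete, here is roughly what I mean (just a sketch, with placeholder names like memoizeTracked and clearAllCaches): since the caches have different 'key/'value types, they would have to be upcast to the non-generic System.Collections.IDictionary before they can share one collection:

open System
open System.Collections
open System.Collections.Concurrent

// every cache, boxed to the non-generic IDictionary so they all fit in one list
let caches = ResizeArray<IDictionary>()

let memoizeTracked f =
    let cache = ConcurrentDictionary<'key, 'value>()
    caches.Add(cache :> IDictionary)   // the boxing step I'd like to avoid
    fun x -> cache.GetOrAdd(x, Func<'key, 'value> f)

let clearAllCaches () =
    caches |> Seq.iter (fun d -> d.Clear())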
Upvotes: 0
Views: 73
Reputation: 5004
This would be my suggestion, simple and effective:
module Foo =
    [...]
    let fcache, foo' = memoize foo

module Bar =
    [...]
    let bcache, bar' = memoize bar

module Main =
    open Foo
    open Bar

    // collect each cache's Clear method so all caches can be emptied at once
    let clearCaches = [
        fcache.Clear
        bcache.Clear
    ]

    [...]

    let result =
        foobar (foo' a) (bar' b)

    let clearAll() =
        clearCaches |> Seq.iter (fun clear -> clear())
Update
If you wanted to collect the clear functions automatically, the memoize function could do it, like this:
open System.Collections.Generic

let clearCaches = Dictionary<string, unit -> unit>()

let memoize (name: string) f =
    let cache = new ConcurrentDictionary<'key,'value>()
    clearCaches.Add(name, cache.Clear)
    fun x -> cache.GetOrAdd(x, Func<'key, 'value> f)

module Foo =
    [...]
    let foo' = memoize "Foo.foo" foo

module Bar =
    [...]
    let bar' = memoize "Bar.bar" bar

module Main =
    open Foo
    open Bar
    [...]
    let result =
        foobar (foo' a) (bar' b)

    let clearAll() =
        clearCaches |> Seq.iter (fun kvp -> kvp.Value())
This would also allow you to clear them individually or based on certain conditions, e.g. per module.
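For example, a per-module clear could look like this (just a sketch, assuming the "Module.function" naming convention used above; clearModule is an illustrative name):

// Clear only the caches registered under a given module prefix,
// e.g. clearModule "Foo" clears "Foo.foo" but leaves "Bar.bar" alone.
let clearModule (modulePrefix: string) =
    clearCaches
    |> Seq.filter (fun kvp -> kvp.Key.StartsWith(modulePrefix + "."))
    |> Seq.iter (fun kvp -> kvp.Value())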
Upvotes: 1
Reputation: 6510
I would suggest using a more robust caching structure than ConcurrentDictionary so that you can specify an expiration policy. Here's one on FSSnip that wraps ConcurrentDictionary and allows for time-based expiration, but you could add expiration based on other criteria. This would let you just use memoizeWithExpiration without having to worry about clean-up on the calling side.
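For illustration only, a time-based variant might look roughly like this (this is a sketch, not the FSSnip snippet itself; the maxAge parameter and the recompute-on-read eviction policy are my assumptions):

open System
open System.Collections.Concurrent

// Each cached value is stored together with its creation time and is
// recomputed once it is older than maxAge.
let memoizeWithExpiration (maxAge: TimeSpan) (f: 'key -> 'value) =
    let cache = ConcurrentDictionary<'key, 'value * DateTime>()
    fun x ->
        match cache.TryGetValue x with
        | true, (value, created) when DateTime.UtcNow - created < maxAge ->
            value
        | _ ->
            let value = f x
            cache.[x] <- (value, DateTime.UtcNow)
            value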
Upvotes: 1