Reputation: 2166
Is there a System.Lazy&lt;T&gt; without exception caching? Or another nice solution for lazy multithreaded initialization & caching?
I've got the following program (fiddle it here):
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
using System.Net;
namespace ConsoleApplication3
{
public class Program
{
public class LightsaberProvider
{
private static int _firstTime = 1;
public LightsaberProvider()
{
Console.WriteLine("LightsaberProvider ctor");
}
public string GetFor(string jedi)
{
Console.WriteLine("LightsaberProvider.GetFor jedi: {0}", jedi);
Thread.Sleep(TimeSpan.FromSeconds(1));
if (jedi == "2" && 1 == Interlocked.Exchange(ref _firstTime, 0))
{
throw new Exception("Dark side happened...");
}
Thread.Sleep(TimeSpan.FromSeconds(1));
return string.Format("Lightsaver for: {0}", jedi);
}
}
public class LightsabersCache
{
private readonly LightsaberProvider _lightsaberProvider;
private readonly ConcurrentDictionary<string, Lazy<string>> _producedLightsabers;
public LightsabersCache(LightsaberProvider lightsaberProvider)
{
_lightsaberProvider = lightsaberProvider;
_producedLightsabers = new ConcurrentDictionary<string, Lazy<string>>();
}
public string GetLightsaber(string jedi)
{
Lazy<string> result;
if (!_producedLightsabers.TryGetValue(jedi, out result))
{
result = _producedLightsabers.GetOrAdd(jedi, key => new Lazy<string>(() =>
{
Console.WriteLine("Lazy Enter");
var light = _lightsaberProvider.GetFor(jedi);
Console.WriteLine("Lightsaber produced");
return light;
}, LazyThreadSafetyMode.ExecutionAndPublication));
}
return result.Value;
}
}
public static void Main()
{
Test();
Console.WriteLine("Maximum 1 'Dark side happened...' strings on the console there should be. No more, no less.");
Console.WriteLine("Maximum 5 lightsabers produced should be. No more, no less.");
}
private static void Test()
{
var cache = new LightsabersCache(new LightsaberProvider());
Parallel.For(0, 15, t =>
{
for (int i = 0; i < 10; i++)
{
try
{
var result = cache.GetLightsaber((t % 5).ToString());
}
catch (Exception e)
{
Console.WriteLine(e.Message);
}
Thread.Sleep(25);
}
});
}
}
}
Basically I want to cache the produced lightsabers, but producing them is expensive and tricky - sometimes exceptions happen. I want to allow only one producer at a time for a given jedi, but when an exception is thrown I want another producer to try again. Therefore, the desired behavior is like System.Lazy&lt;T&gt; with the LazyThreadSafetyMode.ExecutionAndPublication option, but without exception caching.
All in all, the following technical requirements must be met in my example: only one producer runs at a time for a given jedi, a successfully produced lightsaber is cached and reused, and a failed attempt is not cached, so the next caller triggers a retry.
Upvotes: 21
Views: 6587
Reputation: 23898
As per https://stackoverflow.com/a/42567351/34092:
This solution is superior to some of the answers here (including the one marked as the accepted answer as of Oct 2024), since many of the answers to this question hold a reference to the original delegate (the Func passed in) even after the value is successfully constructed.
This solution behaves similarly to @TheodorZoulias' excellent LazyWithRetry solution - with one major difference. With his solution:
Arguably a better behavior is to propagate the error of the valueFactory to all threads that are currently waiting. This way no thread is going to wait for a response for longer than the duration of a single valueFactory invocation.
But with my (this) solution there is no exception caching at all. So if there are multiple callers to Value, no caller will see an exception that was initially raised by another caller. The downside is that calling Value could take a while if valueFactory is slow and repeatedly throws exceptions.
using System;
using System.Threading;

namespace ADifferentLazy
{
    /// <summary>
    /// Basically the same as Lazy with LazyThreadSafetyMode of ExecutionAndPublication, BUT exceptions are not cached
    /// </summary>
    public class LazyWithNoExceptionCaching<T>
    {
        private Func<T> valueFactory;
        private T value = default(T);
        private readonly object lockObject = new object();
        private bool initialized = false;
        private static readonly Func<T> ALREADY_INVOKED_SENTINEL = () => default(T);

        public LazyWithNoExceptionCaching(Func<T> valueFactory)
        {
            this.valueFactory = valueFactory;
        }

        public bool IsValueCreated
        {
            get { return initialized; }
        }

        public T Value
        {
            get
            {
                // Mimic LazyInitializer.EnsureInitialized()'s double-checked locking, whilst allowing
                // control flow to clear valueFactory on successful initialisation
                if (Volatile.Read(ref initialized))
                    return value;

                lock (lockObject)
                {
                    if (Volatile.Read(ref initialized))
                        return value;

                    value = valueFactory();
                    Volatile.Write(ref initialized, true);
                }
                valueFactory = ALREADY_INVOKED_SENTINEL;
                return value;
            }
        }
    }
}
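For illustration (my addition, not part of the original answer), here is a minimal usage sketch with a hypothetical flaky factory, assumed to run inside a Main together with the class above. It shows that a failure is not cached, while a successful value is:

// Hypothetical demo: the first attempt throws; the next call retries and caches the value.
int attempts = 0;
var lazy = new LazyWithNoExceptionCaching<string>(() =>
{
    attempts++;
    if (attempts == 1) throw new Exception("Dark side happened...");
    return "Lightsaber";
});

try { var _ = lazy.Value; } catch (Exception e) { Console.WriteLine(e.Message); } // exception not cached
Console.WriteLine(lazy.Value);          // "Lightsaber" - the factory was retried
Console.WriteLine(lazy.IsValueCreated); // True - the factory is never invoked again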
Upvotes: 3
Reputation: 43545
The two existing answers by vernou and tsul (AtomicLazy&lt;T&gt; and SimpleLazy&lt;T&gt; respectively) solve this problem sufficiently, but they both exhibit a behavior that is not entirely to my liking. In case the valueFactory fails, all the threads that are currently sleeping while waiting for the Value will retry the valueFactory one by one. This means that if, for example, 100 threads request the Value at the same time and the valueFactory takes 1 second before it fails, the valueFactory will be invoked 100 times, and the last thread in the queue will wait for 100 seconds before getting the exception.
Arguably a better behavior is to propagate the error of the valueFactory to all threads that are currently waiting. This way no thread is going to wait for a response for longer than the duration of a single valueFactory invocation. Below is an implementation of a LazyWithRetry&lt;T&gt; class with this behavior:
/// <summary>
/// Represents the result of an action that is invoked lazily on demand, and can be
/// retried as many times as needed until it succeeds, while enforcing a
/// non-overlapping execution policy.
/// </summary>
/// <remarks>
/// In case the action is successful, it is never invoked again. In case of failure
/// the error is propagated to the invoking thread, as well as to all other threads
/// that are currently waiting for the result. The error is not cached. The action
/// will be invoked again when the next thread requests the result, repeating the
/// same pattern.
/// </remarks>
public class LazyWithRetry<T>
{
    private volatile Lazy<T> _lazy;

    public LazyWithRetry(Func<T> valueFactory)
    {
        ArgumentNullException.ThrowIfNull(valueFactory);
        T GetValue()
        {
            try { return valueFactory(); }
            catch { _lazy = new(GetValue); throw; }
        }
        _lazy = new(GetValue);
    }

    public T Value => _lazy.Value;
}
A demonstration of the LazyWithRetry&lt;T&gt; class can be found here. Below is a sample output of this demo:
20:13:12.283 [4] > Worker #1 before requesting value
20:13:12.303 [4] > **Value factory invoked
20:13:12.380 [5] > Worker #2 before requesting value
20:13:12.481 [6] > Worker #3 before requesting value
20:13:12.554 [4] > --Worker #1 failed: Oops! (1)
20:13:12.555 [5] > --Worker #2 failed: Oops! (1)
20:13:12.555 [6] > --Worker #3 failed: Oops! (1)
20:13:12.581 [7] > Worker #4 before requesting value
20:13:12.581 [7] > **Value factory invoked
20:13:12.681 [8] > Worker #5 before requesting value
20:13:12.781 [9] > Worker #6 before requesting value
20:13:12.831 [7] > --Worker #4 failed: Oops! (2)
20:13:12.831 [9] > --Worker #6 failed: Oops! (2)
20:13:12.832 [8] > --Worker #5 failed: Oops! (2)
20:13:12.881 [10] > Worker #7 before requesting value
20:13:12.881 [10] > **Value factory invoked
20:13:12.981 [11] > Worker #8 before requesting value
20:13:13.081 [12] > Worker #9 before requesting value
20:13:13.131 [10] > --Worker #7 received value: 3
20:13:13.131 [11] > --Worker #8 received value: 3
20:13:13.132 [12] > --Worker #9 received value: 3
20:13:13.181 [13] > Worker #10 before requesting value
20:13:13.181 [13] > --Worker #10 received value: 3
20:13:13.182 [1] > Finished
And below is a sample output of the same demo when using either an AtomicLazy&lt;T&gt; or a SimpleLazy&lt;T&gt; class:
20:13:38.192 [4] > Worker #1 before requesting value
20:13:38.212 [4] > **Value factory invoked
20:13:38.290 [5] > Worker #2 before requesting value
20:13:38.390 [6] > Worker #3 before requesting value
20:13:38.463 [5] > **Value factory invoked
20:13:38.463 [4] > --Worker #1 failed: Oops! (1)
20:13:38.490 [7] > Worker #4 before requesting value
20:13:38.590 [8] > Worker #5 before requesting value
20:13:38.690 [9] > Worker #6 before requesting value
20:13:38.713 [5] > --Worker #2 failed: Oops! (2)
20:13:38.713 [6] > **Value factory invoked
20:13:38.791 [10] > Worker #7 before requesting value
20:13:38.891 [11] > Worker #8 before requesting value
20:13:38.963 [6] > --Worker #3 received value: 3
20:13:38.964 [8] > --Worker #5 received value: 3
20:13:38.964 [7] > --Worker #4 received value: 3
20:13:38.964 [9] > --Worker #6 received value: 3
20:13:38.964 [10] > --Worker #7 received value: 3
20:13:38.964 [11] > --Worker #8 received value: 3
20:13:38.991 [12] > Worker #9 before requesting value
20:13:38.991 [12] > --Worker #9 received value: 3
20:13:39.091 [13] > Worker #10 before requesting value
20:13:39.091 [13] > --Worker #10 received value: 3
20:13:39.091 [1] > Finished
A more advanced (memory-optimized) implementation of the LazyWithRetry&lt;T&gt; class can be found in the 5th revision of this answer.
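For completeness, here is a minimal usage sketch (my own addition, not the author's linked demo; the factory and its single failure are hypothetical) that exercises the retry behavior:

// Sketch: the factory fails on its first invocation and succeeds afterwards.
// Every thread waiting on the failed invocation observes that same exception;
// the next request starts a fresh attempt instead of rethrowing a cached error.
int attempt = 0;
var lazyWithRetry = new LazyWithRetry<string>(() =>
{
    if (Interlocked.Increment(ref attempt) == 1)
        throw new InvalidOperationException("Oops!");
    return "Lightsaber";
});

Parallel.For(0, 4, _ =>
{
    try { Console.WriteLine(lazyWithRetry.Value); }
    catch (Exception e) { Console.WriteLine("Failed: " + e.Message); }
});
Console.WriteLine(lazyWithRetry.Value); // succeeds (at the latest on the second attempt) and is cached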
Upvotes: 11
Reputation: 7590
Actually, this feature is under discussion: Introduce a fourth type of LazyThreadSafetyMode: ThreadSafeValueOnly.
In the meantime, I use this graceful implementation from Marius Gundersen: Lazy (and AsyncLazy) don't handle exceptions very well.
public class AtomicLazy<T>
{
    private readonly Func<T> _factory;
    private T _value;
    private bool _initialized;
    private object _lock;

    public AtomicLazy(Func<T> factory)
    {
        _factory = factory;
    }

    public T Value => LazyInitializer.EnsureInitialized(ref _value, ref _initialized, ref _lock, _factory);
}
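This works because LazyInitializer.EnsureInitialized sets the _initialized flag only after the factory returns, so a thrown exception leaves the instance uninitialized and the next caller retries under the lock. A minimal sketch (my addition, with a hypothetical factory) of that behavior:

// Sketch: the first attempt throws and is not cached; the second succeeds and is cached.
int calls = 0;
var lazy = new AtomicLazy<string>(() =>
{
    calls++;
    if (calls == 1) throw new Exception("transient failure");
    return "Lightsaber";
});

try { Console.WriteLine(lazy.Value); } catch (Exception e) { Console.WriteLine(e.Message); }
Console.WriteLine(lazy.Value); // "Lightsaber"
Console.WriteLine(calls);      // 2 - after success the factory is never invoked again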
Upvotes: 14
Reputation: 9
I created this class based on @piotrwest's answer, as an improvement!
internal class CustomLazy<T> where T : class
{
    private readonly Func<T> _valueFactory;
    private Lazy<T> _lazy;
    private int _counter;

    public T Value => _lazy.Value;

    public CustomLazy(Func<T> valueFactory)
    {
        _valueFactory = valueFactory;
        _counter = 0;
        _lazy = new Lazy<T>(Create, LazyThreadSafetyMode.PublicationOnly);
    }

    private T Create()
    {
        try
        {
            if (Interlocked.Increment(ref _counter) == 1)
            {
                return _valueFactory();
            }
            else
            {
                throw new InvalidOperationException();
            }
        }
        finally
        {
            Interlocked.Decrement(ref _counter);
        }
    }
}
Configuring the Lazy instance with LazyThreadSafetyMode.PublicationOnly makes it possible to retry until you get your desired value, but it also allows multiple Create invocations to run at the same time. To counter that, I've added a reference counter so that only one valueFactory call can run at a time. You should consider using this only where you can handle failure from the Value property.
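To make that "handle failure from the Value property" caveat concrete, here is a small hypothetical sketch (my addition): while one thread is inside the value factory, a concurrent caller does not block - it may get the InvalidOperationException and has to retry later.

var lazy = new CustomLazy<string>(() =>
{
    Thread.Sleep(500);        // simulate an expensive creation
    return "Lightsaber";
});

Parallel.For(0, 3, _ =>
{
    try { Console.WriteLine(lazy.Value); }
    catch (InvalidOperationException)
    {
        // Another caller is currently producing the value; retry later.
        Console.WriteLine("Value not ready yet");
    }
});
Console.WriteLine(lazy.Value); // the published value is now cached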
Upvotes: -3
Reputation: 5
A better way:
public class SimpleLazy<T> where T : class
{
    private readonly Func<T> valueFactory;
    private T instance;

    public SimpleLazy(Func<T> valueFactory)
    {
        this.valueFactory = valueFactory;
        this.instance = null;
    }

    public T Value
    {
        get { return LazyInitializer.EnsureInitialized(ref instance, valueFactory); }
    }
}
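One thing worth noting (my addition, not from the answer): this two-argument EnsureInitialized overload takes no lock, so under contention the valueFactory may run on several threads concurrently and only the first published instance is kept; exceptions are still not cached. A minimal sketch with a hypothetical factory:

int calls = 0;
var lazy = new SimpleLazy<string>(() =>
{
    if (Interlocked.Increment(ref calls) == 1) throw new Exception("first attempt fails");
    return "Lightsaber";
});

try { Console.WriteLine(lazy.Value); } catch (Exception e) { Console.WriteLine(e.Message); }
Console.WriteLine(lazy.Value); // "Lightsaber" - the failed attempt was not cached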
Upvotes: -3
Reputation: 2166
Unfortunately this is the wrong solution! Please disregard it and use tsul's answer instead. I'm leaving it here only in case you want to debug it and spot the bug.
Here is a working solution (a concurrent cache with a factory) using tsul's SimpleLazy: https://dotnetfiddle.net/Y2GP2z
I ended up with the following solution: a wrapper that mimics Lazy but without the exception caching.
Here is the LazyWithoutExceptionCaching class:
public class LazyWithoutExceptionCaching<T>
{
    private readonly Func<T> _valueFactory;
    private Lazy<T> _lazy;

    public LazyWithoutExceptionCaching(Func<T> valueFactory)
    {
        _valueFactory = valueFactory;
        _lazy = new Lazy<T>(valueFactory);
    }

    public T Value
    {
        get
        {
            try
            {
                return _lazy.Value;
            }
            catch (Exception)
            {
                _lazy = new Lazy<T>(_valueFactory);
                throw;
            }
        }
    }
}
Full working example (FIDDLE it here):
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
using System.Net;
namespace Rextester
{
public class Program
{
public class LazyWithoutExceptionCaching<T>
{
private readonly Func<T> _valueFactory;
private Lazy<T> _lazy;
public LazyWithoutExceptionCaching(Func<T> valueFactory)
{
_valueFactory = valueFactory;
_lazy = new Lazy<T>(valueFactory);
}
public T Value
{
get
{
try
{
return _lazy.Value;
}
catch (Exception)
{
_lazy = new Lazy<T>(_valueFactory);
throw;
}
}
}
}
public class LightsaberProvider
{
private static int _firstTime = 1;
public LightsaberProvider()
{
Console.WriteLine("LightsaberProvider ctor");
}
public string GetFor(string jedi)
{
Console.WriteLine("LightsaberProvider.GetFor jedi: {0}", jedi);
Thread.Sleep(TimeSpan.FromSeconds(1));
if (jedi == "2" && 1 == Interlocked.Exchange(ref _firstTime, 0))
{
throw new Exception("Dark side happened...");
}
Thread.Sleep(TimeSpan.FromSeconds(1));
return string.Format("Lightsaver for: {0}", jedi);
}
}
public class LightsabersCache
{
private readonly LightsaberProvider _lightsaberProvider;
private readonly ConcurrentDictionary<string, LazyWithoutExceptionCaching<string>> _producedLightsabers;
public LightsabersCache(LightsaberProvider lightsaberProvider)
{
_lightsaberProvider = lightsaberProvider;
_producedLightsabers = new ConcurrentDictionary<string, LazyWithoutExceptionCaching<string>>();
}
public string GetLightsaber(string jedi)
{
LazyWithoutExceptionCaching<string> result;
if (!_producedLightsabers.TryGetValue(jedi, out result))
{
result = _producedLightsabers.GetOrAdd(jedi, key => new LazyWithoutExceptionCaching<string>(() =>
{
Console.WriteLine("Lazy Enter");
var light = _lightsaberProvider.GetFor(jedi);
Console.WriteLine("Lightsaber produced");
return light;
}));
}
return result.Value;
}
}
public static void Main(string[] args)
{
Test();
Console.WriteLine("Maximum 1 'Dark side happened...' strings on the console there should be. No more, no less.");
Console.WriteLine("Maximum 5 lightsabers produced should be. No more, no less.");
}
private static void Test()
{
var cache = new LightsabersCache(new LightsaberProvider());
Parallel.For(0, 15, t =>
{
for (int i = 0; i < 10; i++)
{
try
{
var result = cache.GetLightsaber((t % 5).ToString());
}
catch (Exception e)
{
Console.WriteLine(e.Message);
}
Thread.Sleep(25);
}
});
}
}
}
Upvotes: 4
Reputation: 1586
It's hard to use the built-in Lazy for that: you would have to wrap your LazyWithoutExceptionCaching.Value getter in a lock, but that makes using the built-in Lazy redundant - you would just have unnecessary extra locking inside the Lazy.Value getter.
It's better to write your own Lazy implementation, especially if you only intend to instantiate reference types; it turns out to be rather simple:
public class SimpleLazy<T> where T : class
{
    private readonly Func<T> valueFactory;
    private T instance;
    private readonly object locker = new object();

    public SimpleLazy(Func<T> valueFactory)
    {
        this.valueFactory = valueFactory;
        this.instance = null;
    }

    public T Value
    {
        get
        {
            lock (locker)
                return instance ?? (instance = valueFactory());
        }
    }
}
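As a usage sketch, this is roughly how SimpleLazy plugs into the question's cache (the same idea as the dotnetfiddle linked from the question author's answer; LightsaberProvider and the System.Collections.Concurrent using come from the question's program):

public class LightsabersCache
{
    private readonly LightsaberProvider _lightsaberProvider;
    private readonly ConcurrentDictionary<string, SimpleLazy<string>> _producedLightsabers =
        new ConcurrentDictionary<string, SimpleLazy<string>>();

    public LightsabersCache(LightsaberProvider lightsaberProvider)
    {
        _lightsaberProvider = lightsaberProvider;
    }

    public string GetLightsaber(string jedi)
    {
        // One producer at a time per jedi (SimpleLazy locks), a successful result is
        // cached forever, and a failed production is retried on the next request.
        return _producedLightsabers
            .GetOrAdd(jedi, key => new SimpleLazy<string>(() => _lightsaberProvider.GetFor(key)))
            .Value;
    }
}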
P.S. Maybe we'll have this functionality built-in when this issue gets closed.
Upvotes: 9
Reputation: 28355
As I mentioned in a comment, you can simplify your code by using the TPL's Task object:
var resultTask = Task.Factory.StartNew(
    new Func<object, string>(x => GetFor((string)x)), rawData);

public string GetFor(string jedi)
{
    Console.WriteLine("LightsaberProvider.GetFor jedi: {0}", jedi);
    Thread.Sleep(TimeSpan.FromSeconds(1));
    if (jedi == "2" && 1 == Interlocked.Exchange(ref _firstTime, 0))
    {
        throw new Exception("Dark side happened...");
    }
    Thread.Sleep(TimeSpan.FromSeconds(1));
    return string.Format("Lightsaber for: {0}", jedi);
}
After that, you can wait for the result of this task like this:
resultTask.Wait();
The result of the operation for the concrete x will be cached. If the task runs correctly, you can examine its Result property. If the task fails, its Exception property will store an AggregateException wrapping the actual inner exception. The Result is cached and will not be recalculated; if the task failed, the exception is rethrown whenever you access Result or any other blocking member. If you need a result for a different argument, you should create a new task.
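For example, a hedged sketch of consuming such a task (the "2" argument and the generic StartNew overload are my own illustration, not from the original answer):

// Start the work once; the Task<string> caches either the value or the failure.
var resultTask = Task.Factory.StartNew<string>(x => GetFor((string)x), "2");

try
{
    Console.WriteLine(resultTask.Result); // blocks until completion, then returns the cached value
}
catch (AggregateException ae)
{
    Console.WriteLine(ae.InnerException.Message); // the actual exception thrown by GetFor
}

// Accessing Result again rethrows the same cached exception; to retry,
// a new task has to be started.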
I encourage you to explore this library, as you'll save yourself the time of reinventing the wheel :) You'll also get out-of-the-box functionality such as multithreading, exception handling, task cancellation and much more. Good luck with your projects :)
Upvotes: -1