Reputation: 1886
I'm working on a .net app and I'm using an object that is composed of 5 Lists and 1 Hashtable. This object is used within a loop that iterates at least 500 times to run some analysis. On each loop this object should start empty, so I was wondering if it's more efficient to call clear on all Lists and Hashtable or should I just re-initialize the object?
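To make it concrete, the two approaches I'm weighing look roughly like this (the type and member names below are just illustrative stand-ins, not my real ones):

using System.Collections;
using System.Collections.Generic;

// Simplified stand-in for my real object (which actually has 5 Lists and 1 Hashtable).
class AnalysisState
{
    public List<int> Values = new List<int>();
    public Hashtable Lookup = new Hashtable();

    public void ClearAll()
    {
        Values.Clear();
        Lookup.Clear();
    }
}

class Program
{
    static void Main()
    {
        // Option 1: reuse a single instance and clear it on each iteration.
        var reused = new AnalysisState();
        for (int i = 0; i < 500; ++i)
        {
            reused.ClearAll();
            // ... run analysis using reused ...
        }

        // Option 2: re-initialize the object on each iteration.
        for (int i = 0; i < 500; ++i)
        {
            var fresh = new AnalysisState();
            // ... run analysis using fresh ...
        }
    }
}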
I know I could write code to benchmark this, but I'm wondering if someone has already been down this path?
Thanks.
Upvotes: 0
Views: 1722
Reputation: 7949
While I agree with the other answer that this is a micro-optimization, in the interests of answering the question: I have found that creating a new List is slightly faster than calling Clear. Here's my benchmark code:
static void Main(string[] args)
{
    var start = DateTime.Now;
    List<string> lst = new List<string>();
    for (int i = 0; i < 3000; ++i)
    {
        // Toggle between the two approaches: uncomment the next line
        // (and comment out lst.Clear()) to test re-initialization instead.
        //lst = new List<string>();
        lst.Clear();
        for (int j = 0; j < 500; ++j)
        {
            lst.Add(j.ToString());
        }
    }
    Console.WriteLine("{0} ms", ((DateTime.Now - start).Ticks / TimeSpan.TicksPerMillisecond));
    Console.ReadLine();
}
Over five runs, new List averaged 340.8 ms, and Clear averaged 354.8 ms.
However, these results are so close that it's clear this really is a micro-optimization: either approach is fine.
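As an aside, System.Diagnostics.Stopwatch generally gives finer-grained timing than DateTime.Now. Here's a sketch of the same benchmark using it (not the code I originally ran, so treat the numbers above as coming from the DateTime version):

using System;
using System.Collections.Generic;
using System.Diagnostics;

class StopwatchBenchmark
{
    static void Main()
    {
        // Stopwatch has much finer resolution than DateTime.Now,
        // so it is usually preferred for timing short loops.
        var sw = Stopwatch.StartNew();
        var lst = new List<string>();
        for (int i = 0; i < 3000; ++i)
        {
            lst.Clear();                 // or: lst = new List<string>();
            for (int j = 0; j < 500; ++j)
            {
                lst.Add(j.ToString());
            }
        }
        sw.Stop();
        Console.WriteLine("{0} ms", sw.ElapsedMilliseconds);
    }
}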
Upvotes: 6
Reputation: 1500485
The cost of creating 3000 empty collections will be tiny. Unless your "analysis" is really trivial, this isn't going to be significant at all. Write the clearest code you can - which is likely to be creating a new set of collections each time rather than reusing them. You should only reuse an object if the logical operation is to reuse it.
Once you've written the code in the most readable way, test whether it performs as well as you need it to. If it doesn't, then you can start micro-optimizing.
I would, however, strongly recommend that you use Dictionary<,> instead of Hashtable.
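To illustrate why (this example is just a sketch, not part of the answer itself): Dictionary<,> is strongly typed, so you avoid the casts and the boxing of value types that Hashtable requires.

using System.Collections;
using System.Collections.Generic;

class Example
{
    static void Main()
    {
        // Hashtable stores object keys/values, so value types are boxed
        // and every read needs a cast.
        Hashtable table = new Hashtable();
        table["answer"] = 42;               // the int is boxed to object
        int fromTable = (int)table["answer"];

        // Dictionary<,> is strongly typed: no casts, no boxing of the int,
        // and the compiler catches type mistakes at build time.
        Dictionary<string, int> dict = new Dictionary<string, int>();
        dict["answer"] = 42;
        int fromDict = dict["answer"];
    }
}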
Upvotes: 6