Reputation: 771
Curious as I am, I wrote a small program that writes one space into a text file, then 2, then 4, and so on. I record the time each loop needs, and of course that time grows exponentially. While it is only about 0.003 seconds at the beginning, it reaches the minute mark really fast. Now I want to calculate an estimated time for the program to finish.
This is the code I have so far:
// This creates the file if it doesn't exist
File.AppendAllText("C:/1G.txt", "");

// I am starting with 30 iterations
for (int i = 0; i < 30; i++)
{
    DateTime start = DateTime.Now;
    // The 1 << i will loop 1, 2, 4, 8, etc. times
    for (int j = 0; j < 1 << i; j++)
    {
        File.AppendAllText("C:/1G.txt", " ");
    }
    DateTime end = DateTime.Now;
    Console.WriteLine($"i = {i}, 1<<i = {1 << i}, Time: {(end - start)}");
}
Normally, to calculate an estimated time, you would take the time spans you have already measured for each task, sum them up and divide by the number of measurements. That is not possible here, because we can be sure that the next iteration will take longer than the previous one.
I could just double the time for each iteration and sum those up to get an estimate. But my "problem" is that the measurements don't double exactly (which would be impossible anyway):
Time: 00:00:00.0150
Time: 00:00:00.0020
Time: 00:00:00.0010
Time: 00:00:00.0020
Time: 00:00:00.0060
Time: 00:00:00.0090
Time: 00:00:00.0850
Time: 00:00:00.0708
Time: 00:00:00.3261
Time: 00:00:00.6483
Time: 00:00:01.0382
Time: 00:00:02.1114
Time: 00:00:02.4375
Time: 00:00:04.3125
Time: 00:00:09.0887
Time: 00:00:17.9730
...
How could I calculate a rough estimated time for this case?
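To illustrate what I mean by "just doubling": a naive estimate would take the last measured iteration and assume every remaining iteration takes twice as long as the one before it. A rough sketch (the method and parameter names are made up, and it obviously ignores the noise visible above):

TimeSpan EstimateRemaining(TimeSpan lastIteration, int iterationsDone, int totalIterations)
{
    double estimateMs = 0;
    double nextMs = lastIteration.TotalMilliseconds;
    for (int i = iterationsDone; i < totalIterations; i++)
    {
        nextMs *= 2;          // doubling assumption
        estimateMs += nextMs; // sum of all remaining iterations
    }
    return TimeSpan.FromMilliseconds(estimateMs);
}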
Upvotes: 0
Views: 424
Reputation: 303
Are you trying to prove that bad practices can impact your code's performance? If not, and you really want to measure execution time, first of all use a Stopwatch for the measurement (create it once and reset it after the inner loop finishes) - it's much better suited for measuring durations than comparing DateTime.Now values.
Secondly, by using File.AppendAllText you are opening and closing a stream to the file with every method invocation. It would be much better to open the stream once, write the data you want, and close it once afterwards.
Could you elaborate on what you are actually trying to achieve? I can't really tell what you are asking in the first place. You are doing exponentially more work, so with your implementation the time also rises exponentially. If I understand correctly, you want the average time of writing a single space to the file. To get that, you have to divide each measurement by the number of spaces written in that iteration. I'd implement it the following way:
using System;
using System.Diagnostics;
using System.IO;
using System.Linq;

static void Main()
{
    var stopwatch = new Stopwatch();
    var samples = new double[30];
    for (var i = 0; i < 30; i++)
    {
        stopwatch.Start();
        // File.OpenWrite creates the file if it doesn't exist
        // Move these usings outside of the loop if you don't want to measure opening/closing the stream to the file
        using (var fileStream = File.OpenWrite("D:\\1G.txt"))
        using (var streamWriter = new StreamWriter(fileStream))
        {
            // Option A
            // This creates a string with the desired number of spaces;
            // no inner loop necessary, but it allocates a lot of memory
            streamWriter.Write(new string(' ', 1 << i));

            // Option B
            // If you insist on using a loop
            //for (int j = 0; j < 1 << i; j++)
            //{
            //    streamWriter.Write(' ');
            //}
        }
        stopwatch.Stop();
        // Stopwatch.Elapsed is already a TimeSpan (ElapsedTicks are raw timer ticks, not TimeSpan ticks)
        var writeDurationTimeSpan = stopwatch.Elapsed;
        var writeDurationInMs = writeDurationTimeSpan.TotalMilliseconds;
        var singleSpaceWriteDurationInMs = writeDurationTimeSpan.TotalMilliseconds / (1 << i);
        samples[i] = singleSpaceWriteDurationInMs;
        Console.WriteLine("i = {0}, 1<<i = {1}, Execution duration: {2} ms, Single space execution duration: {3} ms",
            i,
            1 << i,
            writeDurationInMs,
            singleSpaceWriteDurationInMs.ToString("F20").TrimEnd('0')
        );
        stopwatch.Reset();
    }
    Console.WriteLine("Average single space invocation time: {0} ms",
        samples.Average().ToString("F20").TrimEnd('0')
    );
}
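Once you have that average, a rough estimate of the total run time is simply the per-space average multiplied by the total number of spaces you are going to write. For 30 doubling iterations that total is 1 + 2 + 4 + ... + 2^29 = 2^30 - 1. A hypothetical helper (not part of the code above):

static TimeSpan EstimateTotal(double averageSpaceWriteMs, int iterations)
{
    // Total spaces written across all doubling iterations: 2^iterations - 1
    long totalSpaces = (1L << iterations) - 1;
    return TimeSpan.FromMilliseconds(averageSpaceWriteMs * totalSpaces);
}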
By the way, I really recommend using BenchmarkDotNet for benchmarks, execution time measurement, etc. Do give it a try - it's a fantastic library.
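A minimal sketch of what that could look like, assuming the BenchmarkDotNet NuGet package is installed and you run it in a Release build (the class name, parameter values and file path are made up):

using System.IO;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

public class SpaceWriteBenchmark
{
    // Number of spaces written per invocation
    [Params(1, 1024, 1 << 20)]
    public int Count;

    [Benchmark]
    public void WriteSpaces()
    {
        using (var fileStream = File.OpenWrite("D:\\1G.txt"))
        using (var streamWriter = new StreamWriter(fileStream))
        {
            streamWriter.Write(new string(' ', Count));
        }
    }
}

public class Program
{
    public static void Main() => BenchmarkRunner.Run<SpaceWriteBenchmark>();
}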
Upvotes: 1