Reputation: 1186
I am writing a Monte Carlo algorithm and I want to display its progress in a progress bar (values from 0% to 100%).
My initial thought is to compare the standard deviation produced by the algorithm against the specified solution tolerance, like:
progress = 100 * specified_tolerance / standard_deviation
However, I wonder if there is something better, or if my approach has some pitfall.
[EDIT]
A sample picture of the simulations I'm making:
Thanks
Upvotes: 1
Views: 109
Reputation: 20130
Well, the problem with your solution, I think, is that the standard deviation goes down as the inverse square root of N (the number of events generated). Assuming N is proportional to simulation time, your scale would behave like
progress = C * sqrt(t)
which is quite unnatural if you ask me.
I would redo the scale to be linear, which means dealing with squared sigma (i.e., the variance).
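A minimal sketch of what a variance-based, linear scale could look like (the function name and the running-mean estimator are my own illustration, not part of the question): since the variance of the mean estimate shrinks as 1/N, the ratio tolerance² / variance grows roughly linearly with the number of samples.

```python
def linear_progress(samples, tolerance):
    """Progress in [0, 100] that grows ~linearly with the number of
    samples, because the variance of the mean shrinks as 1/N."""
    n = len(samples)
    if n < 2:
        return 0.0
    mean = sum(samples) / n
    # sample variance of the mean estimate: s^2 / n
    var_of_mean = sum((x - mean) ** 2 for x in samples) / (n - 1) / n
    if var_of_mean == 0.0:
        return 100.0
    # tolerance^2 / variance grows ~linearly in N -> linear progress bar
    return min(100.0, 100.0 * tolerance ** 2 / var_of_mean)
```

Clamping with `min` keeps the bar from overshooting 100% once the estimate is already tighter than the requested tolerance.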
UPDATE
Thinking about it, I would do both. Typically you have a green/blue bar with a percentage displayed on top of it. I would separate the two indicators: make the progress bar itself linear (driven by the variance), but have the percentage display use the standard deviation, and therefore follow a sqrt() curve. It would look a bit weird, but it might be the best of both worlds.
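The two-indicator idea above could be sketched like this (the function name is hypothetical; it just computes both values from the current standard deviation):

```python
def progress_indicators(std_dev, tolerance):
    """Return (bar_fill, percent_label), both clamped to [0, 100]:
    - bar_fill is variance-based, so it grows ~linearly in N;
    - percent_label is std-dev-based, so it grows ~like sqrt(N)."""
    if std_dev <= 0.0:
        return 100.0, 100.0
    bar = min(100.0, 100.0 * (tolerance / std_dev) ** 2)  # linear in N
    pct = min(100.0, 100.0 * tolerance / std_dev)         # ~ sqrt(N)
    return bar, pct
```

For example, halfway there in sigma terms (std_dev twice the tolerance), the bar would show 25% filled while the label reads 50%, which is exactly the mismatch the update describes.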
Upvotes: 2