I ran a benchmark example and got this table.
BenchmarkDotNet=v0.12.0, OS=Windows 7 SP1 (6.1.7601.0)
Intel Xeon CPU E5-4660 v3 2.10GHz, 1 CPU, 28 logical and 14 physical cores
Frequency=2050214 Hz, Resolution=487.7540 ns, Timer=TSC
[Host] : .NET Framework 4.8 (4.8.4018.0), X86 LegacyJIT [AttachedDebugger]
DefaultJob : .NET Framework 4.8 (4.8.4018.0), X86 LegacyJIT
| Method | Mean | Error | StdDev |
|------- |----------:|---------:|---------:|
| Sha256 | 173.60 us | 3.466 us | 9.604 us |
| Md5 | 29.95 us | 0.599 us | 1.709 us |
How do I read this table? What is the actual meaning of the Mean, Error, and StdDev columns?
I'm new to BenchmarkDotNet and can't find any reference for these columns.
Can anyone provide a link that explains them?
Upvotes: 18
Views: 6309
Reputation: 64
All Legends:
Mean : Arithmetic mean of all measurements
Error : Half of 99.9% confidence interval
StdDev : Standard deviation of all measurements
Median : Value separating the higher half of all measurements (50th percentile)
Ratio : Mean of the ratio distribution ([Current]/[Baseline])
Rank : Relative position of current benchmark mean among all benchmarks (Arabic style)
Gen 0 : GC Generation 0 collects per 1000 operations
Gen 1 : GC Generation 1 collects per 1000 operations
Allocated : Allocated memory per single operation (managed only, inclusive, 1KB = 1024B)
1 us : 1 Microsecond (0.000001 sec)
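For context, a table like the one in the question is produced by a benchmark class along these lines. This is a sketch modeled on the well-known Md5VsSha256 intro sample from the BenchmarkDotNet documentation; the buffer size and random seed are arbitrary choices here:

```csharp
using System;
using System.Security.Cryptography;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

public class Md5VsSha256
{
    private readonly byte[] data = new byte[10000];
    private readonly SHA256 sha256 = SHA256.Create();
    private readonly MD5 md5 = MD5.Create();

    public Md5VsSha256() => new Random(42).NextBytes(data);

    // Each [Benchmark] method becomes one row of the summary table.
    [Benchmark]
    public byte[] Sha256() => sha256.ComputeHash(data);

    [Benchmark]
    public byte[] Md5() => md5.ComputeHash(data);
}

public class Program
{
    public static void Main(string[] args) => BenchmarkRunner.Run<Md5VsSha256>();
}
```

BenchmarkDotNet runs each method many times, collects the per-iteration timings, and then reports the statistics (Mean, Error, StdDev, and so on) described by the legends above.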
Upvotes: 1
Reputation: 23268
You can find this information in a blog post by the author of BenchmarkDotNet:
Mean : Arithmetic mean of all measurements
Error : Half of 99.9% confidence interval
StdDev : Standard deviation of all measurements
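To make the three columns concrete, here is a sketch in C# that computes them from a handful of made-up measurements. The t-value for the 99.9% confidence interval is hardcoded for this sample size (BenchmarkDotNet computes the interval properly from the Student's t distribution):

```csharp
using System;
using System.Linq;

class Stats
{
    static void Main()
    {
        // Hypothetical per-iteration measurements, in microseconds.
        double[] m = { 171.2, 169.8, 175.4, 180.1, 172.9, 168.5, 176.3 };
        int n = m.Length;

        // "Mean": arithmetic mean of all measurements.
        double mean = m.Average();

        // "StdDev": sample standard deviation (n - 1 denominator).
        double stdDev = Math.Sqrt(m.Sum(x => (x - mean) * (x - mean)) / (n - 1));

        // "Error": half of the 99.9% confidence interval,
        // i.e. t * stdDev / sqrt(n). 5.959 is the two-sided
        // t quantile for n - 1 = 6 degrees of freedom at 99.9%
        // (looked up from a table, not computed here).
        double t = 5.959;
        double error = t * stdDev / Math.Sqrt(n);

        Console.WriteLine($"Mean = {mean:F2} us, StdDev = {stdDev:F2} us, Error = {error:F2} us");
    }
}
```

So a row like `Sha256 | 173.60 us | 3.466 us` says the true mean is estimated to lie within 173.60 ± 3.466 us with 99.9% confidence.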
Upvotes: 11