Reputation: 209
I am working on a project that collects performance data, much like the Performance Monitor does.
However, when I run a monitor on Pages/sec, it gives a different result than the Performance Monitor. I think it is because the performance counter is not returning all of the decimals, so the average calculation becomes inaccurate.
My code UPDATED:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Diagnostics;
using System.Threading;
using System.Net;
using System.Management;
using System.Net.NetworkInformation;
namespace PerformanceMonitor
{
    class Program
    {
        static void Main(string[] args)
        {
            List<float> pagesSec = new List<float>();
            PerformanceCounter memoryPages = new PerformanceCounter("Memory", "Pages/sec");
            int count = 0;

            while (count < 50)
            {
                // NextValue() on a rate counter returns the rate measured since the
                // previous call; the very first call has no previous sample and returns 0.
                float pagesSecValue = memoryPages.NextValue();
                pagesSec.Add(pagesSecValue);
                Console.WriteLine("Pages: " + pagesSecValue);
                count++;
                Thread.Sleep(1000);
                Console.Clear();
            }

            Console.WriteLine("Avg pages/sec: " + pagesSec.Average());
            Console.ReadLine();
        }
    }
}
While the program runs, most of the values printed to the console are 0.
Results: my program: 4,06349; Windows Performance Monitor: 12,133
Why the difference?
Upvotes: 2
Views: 979
Reputation: 35881
You're doing a different calculation than the performance counter. Your code reads the pages-per-second value once a second for 50 seconds and averages those 50 numbers. One, the Performance Monitor is clearly working with data from a longer period of time. Two, this isn't a very useful average, because the Performance Monitor is effectively sampling at a much higher rate. For example, what do you think would happen if the pages-per-second values did this over a period of 2 seconds: 0 .5 1 1.5 2 5 15 6 20 4
And your code sampled only at 0, 1, and 2 seconds? Your "average" would be 5, while the performance counter (if it sampled every 0.5 seconds, which it doesn't) would come out around 10. Sparse point samples easily miss the short paging spikes that dominate the true average.
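One way to get closer to what Performance Monitor reports is to sample the counter much more often than once a second and average the readings over the same window the monitor is showing. The sketch below is only an illustration of that idea, not the poster's code: the 100 ms interval and the 60-second window are arbitrary choices, and the first NextValue() reading is thrown away because a rate counter needs two samples before it returns anything meaningful.

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Threading;

class DenseSampler
{
    static void Main()
    {
        var counter = new PerformanceCounter("Memory", "Pages/sec");
        counter.NextValue();              // first read of a rate counter is always 0; discard it
        Thread.Sleep(100);

        var samples = new List<float>();
        int sampleIntervalMs = 100;       // arbitrary: 10 samples per second
        int totalSamples = 600;           // arbitrary: a 60-second window

        for (int i = 0; i < totalSamples; i++)
        {
            samples.Add(counter.NextValue());
            Thread.Sleep(sampleIntervalMs);
        }

        // With equally spaced samples this is a time-weighted average over the window,
        // so brief paging spikes are no longer missed by sparse sampling.
        Console.WriteLine("Avg pages/sec over {0} s: {1}",
            totalSamples * sampleIntervalMs / 1000, samples.Average());
    }
}

Even then, the two numbers will only line up if both tools are averaging over the same time window; Performance Monitor's displayed average covers whatever interval its graph currently spans, not your program's 50 seconds.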
Upvotes: 2