Reputation: 3909
I'm creating a tool that would track the memory and CPU usage of a particular process of choice. To make the tool precise I need to track the resources closely, probing for them every 50 ms, say.
An assumption was made that anything happening inside a modern computer (excluding network IO, user input) will not take more time than that (if it's not a big computation it should be very fast). I was wrong: the Get-WMIObject cmdlet takes far longer than that to return results. Please compare:
time { Get-WMIObject Win32_Process -Filter "ProcessId='24380'"} -Samples 10 -Silent
# ..........
# Avg: 190.5335ms
# Min: 180.3689ms
# Max: 203.6968ms
time { get-process -name *chrome* 2> $null } -Samples 10 -Silent
# ..........
# Avg: 5.2ms
# Min: 3.7711ms
# Max: 13.5161ms
Questions: why is it so slow, and what can I do about it? (The main motivation is to use the private working set and the many other metrics that Get-WMIObject provides inside my tool.) Would interacting with Win32_PerfRawData_PerfProc_Process
from a C/C++/C# utility be a good alternative to using it in PowerShell?
Upvotes: 2
Views: 3825
Reputation: 159
Not to sound pessimistic or anything, but in PowerShell there's slow and there's slower. Use Measure-Command to find the time it takes for "stuff" to run. Usage: Measure-Command { Get-WMIObject Win32_Process -Filter "ProcessId='24380'" }
The "modern" way of interacting with the WMI store is to use CIM instances. See if that's faster for you, using either the time function you're using or Measure-Command. Example: Measure-Command { Get-CimInstance -ClassName Win32_Process -Filter "ProcessId='24380'" }
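If CIM is fast enough, one further thing worth trying for a tight polling loop is creating the CimSession once and reusing it, so each query skips connection setup. A minimal sketch, assuming a local machine, the process id 24380 from the question, and that WorkingSetPrivate (private working set in bytes) is the metric you're after:

```powershell
# Create the session once, outside the polling loop.
$session = New-CimSession

while ($true) {
    # Query the raw perf counter class through the reused session.
    $p = Get-CimInstance -CimSession $session `
        -ClassName Win32_PerfRawData_PerfProc_Process `
        -Filter "IDProcess=24380"

    # WorkingSetPrivate is reported in bytes.
    "{0:N0} bytes private working set" -f $p.WorkingSetPrivate

    Start-Sleep -Milliseconds 50
}
```

Note that the raw perf classes report cooked values like PercentProcessorTime as raw counters, so CPU usage has to be computed from the deltas between two samples.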
Upvotes: 2