Reputation: 409
If I run top -p $(pgrep -d',' scrapy) I get information on the scrapy process, but this process probably spawns other Python-related processes. How can I get real-time information on those processes as well, the way the top command does? Thanks,
Dani
Upvotes: 2
Views: 463
Reputation:
What you're looking for is a program or script that will gather the CPU usage of all child processes spawned by scrapy.
If you wanted to script this yourself, you could look at the output of ps -p {scrapy pid} -L, which lists all the threads spawned by the scrapy process.
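Note that child processes may not be named scrapy, so selecting by command name alone can miss them. One way to include direct children is to combine pgrep's -P (parent PID) flag with top. A minimal sketch, assuming a procps-style pgrep; the list_tree helper name is hypothetical:

```shell
#!/bin/sh
# Print the PIDs of a named process plus its direct children,
# comma-separated, in a form suitable for top -p.
# "list_tree" is a hypothetical helper name, not a standard tool.
list_tree() {
    parents=$(pgrep -d',' "$1") || return 1    # PIDs matching the name
    children=$(pgrep -d',' -P "$parents")      # direct children of those PIDs
    printf '%s\n' "${parents}${children:+,$children}"
}
```

You could then run top -p "$(list_tree scrapy)" to watch the whole group live.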
Or, you could chain together a couple Linux commands to have a one-liner:
ps -C scrapy -o pcpu= | awk '{cpu_usage+=$1} END {print cpu_usage}'
ps:
-C scrapy
selects processes by command name
-o pcpu=
tells ps
to display only CPU usage (the trailing = suppresses the column header)
awk:
{cpu_usage+=$1}
runs for each line of output from ps, accumulating the values
END {print cpu_usage}
sends the sum to STDOUT once the input ends.
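To sanity-check the awk half of the one-liner, you can feed it a few sample pcpu values in place of the ps output:

```shell
# Same awk program as in the one-liner, fed three sample
# pcpu values instead of live ps output.
printf '1.5\n2.0\n0.5\n' | awk '{cpu_usage+=$1} END {print cpu_usage}'
# prints 4
```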
Upvotes: 3