Ajay R

Reputation: 21

Stress testing a command-line application

I have a command-line Perl script that I want to stress test. Basically, I want to run multiple instances of the same script in parallel so that I can figure out at what point our machine becomes unresponsive.

Currently I am doing something like this:

$ prog > output1.txt 2> err1.txt &
$ prog > output2.txt 2> err2.txt &
  ...

and then checking ps to see which instances finished and which didn't. Is there an open-source application that can automate this process, preferably with a web interface?
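The manual launch-and-poll-ps loop above can be scripted directly: launch N background jobs and collect each one's exit status with wait. A minimal sketch; prog here is a dummy shell function standing in for the actual Perl script:

```shell
# Launch N copies in parallel and record each exit status instead of
# polling ps by hand. prog() is a stand-in for the real script.
prog() { sleep 0.1; }

N=10
for i in $(seq 1 "$N"); do
    prog > "output$i.txt" 2> "err$i.txt" &
done

# A bare "wait" reaps all jobs but discards their statuses, so wait on
# each background job individually and count the failures.
failed=0
for pid in $(jobs -p); do
    wait "$pid" || failed=$((failed + 1))
done
echo "$failed of $N instances failed"
```

Each run leaves the per-instance output/err files behind, so the naming scheme from the question is preserved.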

Upvotes: 2

Views: 2146

Answers (3)

Ole Tange

Reputation: 33740

With GNU Parallel this will run one prog per CPU core:

seq 1 1000 | parallel prog \> output{}.txt 2\>err{}.txt

If you want to run 10 progs per CPU core, do:

seq 1 1000 | parallel -j1000% prog \> output{}.txt 2\>err{}.txt

Watch the intro video to learn more: http://www.youtube.com/watch?v=OpaiGYxkSuQ

Upvotes: 1

Benjamin Measures

Reputation: 31

You can use xargs to run commands in parallel:

seq 1 100 | xargs -n 1 -P 0 -I{} sh -c 'prog > output{}.txt 2>err{}.txt'

With -P 0, xargs runs as many processes as possible simultaneously, so all 100 instances start in parallel.
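A side benefit of the xargs approach: xargs itself exits with status 123 if any invoked command failed (exit status 1-125), which gives a quick pass/fail signal for the whole batch. A sketch, with a no-op command standing in for prog:

```shell
# ":" (a no-op that always succeeds) stands in for prog.
# xargs exits 0 only if every invocation succeeded; 123 if any failed.
seq 1 100 | xargs -n 1 -P 0 -I{} sh -c ': > output{}.txt 2> err{}.txt'
status=$?
echo "batch exit status: $status"
```

So a simple status check after the batch tells you whether every instance ran to completion.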

For a better testing framework (including parallel testing via 'spawn') take a look at Expect.

Upvotes: 3

Jon

Reputation: 2542

Why not use crontab (or Scheduled Tasks on Windows) to run the script automatically?

You could then write something to parse the output automatically.
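For example, a small parsing sketch that counts how many instances produced stderr output, assuming the err*.txt naming scheme from the question (the sample err files here are fabricated for illustration):

```shell
# Sample data standing in for a real run's output files:
printf 'something broke\n' > err1.txt
: > err2.txt   # empty stderr file, i.e. a clean run

# Count instances whose err file is non-empty (size greater than 0 bytes).
failures=$(find . -maxdepth 1 -name 'err*.txt' -size +0c | wc -l)
echo "$failures instance(s) produced stderr output"
```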

Upvotes: 0
