Reputation: 208
I have a script, script.sh, that sends a request to some URL and prints the execution time using time. Here is the source code:
time wget http://some.url
When I run it from the command line, I see the time result with an accuracy of 3 decimal places:
real 0m0.584s
user 0m0.000s
sys 0m0.002s
But when I run this shell script using the PHP function exec("script.sh", $output), I see the time result with an accuracy of only 2 decimal places:
0.00user 0.00system 0:00.32elapsed 0%CPU (0avgtext+0avgdata 3788maxresident)k
How can I get the same result, as I do see in command line?
Upvotes: 2
Views: 196
Reputation: 42694
The answer is in the manual:
Note: some shells (e.g., bash(1)) have a built-in time command that provides less functionality than the command described here. To access the real command, you may need to specify its pathname (something like /usr/bin/time).
You're getting the increased precision from bash's own time keyword; if you run /usr/bin/time from the shell, you will see two decimal places as well.
Unfortunately, time is a special case: it's a shell keyword, not actually a builtin command. Otherwise bash's builtin command could be used to force one version or the other.
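You can check that distinction directly in bash; type -t reports how a name resolves (keyword, builtin, file, etc.):

```shell
#!/usr/bin/env bash
# time is a reserved word (keyword) in bash, not a builtin:
type -t time   # → keyword
# cd, by contrast, is a real builtin:
type -t cd     # → builtin
```

Because time is a keyword, it is only recognized when bash itself parses the command line, which is exactly why it disappears when the script is launched through another program.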
So it looks like you're stuck with two decimal places unless you want to try a different method:
start_time=$(date +%s%N)
wget http://some.url
end_time=$(date +%s%N)
bc -l <<< "scale=4; ($end_time - $start_time) / 1000000000"
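A further option (a sketch, not from the original answer): since the extra precision comes from bash's time keyword, you can invoke bash explicitly so the keyword is available, and use TIMEFORMAT (documented in the bash manual) to control its output. %3R prints the real (wall-clock) time with three decimal places; the keyword writes to stderr, hence the 2>&1:

```shell
# Hypothetical sketch: force bash's time *keyword* from any caller,
# e.g. PHP's exec(). `sleep 0.1` stands in here for the wget call.
# TIMEFORMAT="%3R" → print only the real time, 3 decimal places.
bash -c 'TIMEFORMAT="%3R"; time sleep 0.1' 2>&1
```

From PHP that might look like exec("bash -c 'TIMEFORMAT=%3R; time wget -q http://some.url' 2>&1", $output); the exact quoting is an assumption, so test it against your PHP setup.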
Upvotes: 2