Reputation: 1366
Using any tools you would expect to find on a *nix system (in fact, MS-DOS is fine too, if you want), what is the easiest/fastest way to calculate the mean of a set of numbers, assuming you have them one per line in a stream or file?
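For example, the input might be a file such as numbers.txt (the name several answers below happen to use), with one value per line; the values here are just an illustration, chosen to match the sample output shown in one of the answers:
1
2
3
4
5
6
7
8
9
10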
Upvotes: 13
Views: 2135
Reputation: 8185
Ruby one-liner
cat numbers.txt | ruby -ne 'BEGIN{$sum=0}; $sum=$sum+$_.to_f; END{puts $sum/$.}'
Upvotes: 0
Reputation: 611
Using "st" (https://github.com/nferraz/st):
$ st numbers.txt
N min max sum mean sd
10.00 1.00 10.00 55.00 5.50 3.03
Specify an option to see individual stats:
$ st numbers.txt --mean
5.5
(DISCLAIMER: I wrote this tool :))
Upvotes: 2
Reputation: 2919
In PowerShell, it would be
get-content .\meanNumbers.txt | measure-object -average
Of course, that's the verbose syntax. If you typed it using aliases,
gc .\meanNumbers.txt | measure-object -a
Upvotes: 1
Reputation: 48290
awk ' { n += $1 }; END { print n / NR }'
This accumulates the sum in n, then divides by the number of items (NR = Number of Records).
Works for integers or reals.
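For example, assuming a numbers.txt like the one sketched in the question (the integers 1 through 10), the invocation and its output would be:
$ awk ' { n += $1 }; END { print n / NR }' numbers.txt
5.5
With no file argument it reads standard input, so piping a stream into it works just as well.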
Upvotes: 17
Reputation: 109022
perl -e 'while (<>) { $sum += $_; $count++ } print $sum / $count, "\n"'
Upvotes: 3
Reputation: 40309
Perl.
@a = <STDIN>;                   # slurp all lines into an array
for ($i = 0; $i < @a; $i++)     # @a in numeric context gives the element count
{
    $sum += $a[$i];
}
print $sum / @a, "\n";
Caveat Emptor: My syntax may be a little whiffly.
Upvotes: 0