Reputation: 219
I need to download several files with wget and measure the download speed.
For example, I download with
wget -O /dev/null http://ftp.bit.nl/pub/OpenBSD/4.7/i386/floppy47.fs http://ftp.bit.nl/pub/OpenBSD/4.7/i386/floppyB47.fs
and the output is
--2010-10-11 18:56:00-- http://ftp.bit.nl/pub/OpenBSD/4.7/i386/floppy47.fs
Resolving ftp.bit.nl... 213.136.12.213, 2001:7b8:3:37:20e:cff:fe4d:69ac
Connecting to ftp.bit.nl|213.136.12.213|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1474560 (1.4M) [text/plain]
Saving to: `/dev/null'
100%[==============================================================>] 1,474,560 481K/s in 3.0s
2010-10-11 18:56:03 (481 KB/s) - `/dev/null' saved [1474560/1474560]
--2010-10-11 18:56:03-- http://ftp.bit.nl/pub/OpenBSD/4.7/i386/floppyB47.fs
Reusing existing connection to ftp.bit.nl:80.
HTTP request sent, awaiting response... 200 OK
Length: 1474560 (1.4M) [text/plain]
Saving to: `/dev/null'
100%[==============================================================>] 1,474,560 499K/s in 2.9s
2010-10-11 18:56:06 (499 KB/s) - `/dev/null' saved [1474560/1474560]
FINISHED --2010-10-11 18:56:06--
Downloaded: 2 files, 2.8M in 5.9s (490 KB/s)
I need to grep the total download speed, that is, the string 490 KB/s.
How do I do this?
P.S. This may need to account for the case where only one file is downloaded, in which case there is no final summary line starting with FINISHED.
Upvotes: 7
Views: 30765
Reputation: 1
For example, to get the speed in Mbit per second, add --report-speed=bits to wget and change the grep pattern slightly:
wget -O /dev/null --report-speed=bits http://www.ovh.net/files/10Mb.dat 2>&1 | grep -o "[0-9.,]\+ [KM]*[Bb]/s"
Output:
1,51 Mb/s
Upvotes: 0
Reputation: 2592
This works when only 1 file is being downloaded.
I started using sed to get the speed from wget, but I found it irritating, so I switched to grep.
This is my command:
wget ... 2>&1 | grep -o "[0-9.]\+ [KM]*B/s"
The -o option means grep returns only the matching part. The pattern matches one or more digits (and any decimal point), then a space, then an optional K or M before B/s. That will return, for example, 423 KB/s.
To grep for just the units, use grep -o "[KM]*B/s", and for just the number use grep -o "[0-9.]\+".
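To also cover the multi-file case from the question, a minimal sketch (assuming the last speed wget prints is the one wanted, i.e. the per-file figure for a single download or the FINISHED summary for several; the URLs are placeholders) is to keep only the final match:
wget -O /dev/null http://example.com/file1.fs http://example.com/file2.fs 2>&1 \
| grep -o "[0-9.]\+ [KM]*B/s" \
| tail -n 1
Here grep emits every per-file speed plus the summary, and tail -n 1 keeps only the final one.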
Upvotes: 2
Reputation: 15134
Update: a grep-style version using sed:
wget ... 2>&1 | sed -n '$,$s/.*(\(.*\)).*/\1/p'
Old version:
I thought it would be easier to divide the file size by the download time after the download. ;-)
(/usr/bin/time -p wget ... 2>&1 >/dev/null; ls -l newfile) | \
awk '
NR==1 {t=$2};
NR==4 {printf("rate=%f bytes/second\n", $5/t)}
'
The first awk line stores the elapsed real time from time's "real xx.xx" output in variable t. The second awk line divides the file size (column 5 of ls -l) by that time and prints the result as the rate.
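As a concrete sketch of the old version, using the question's first URL (the output file name stands in for newfile, and wget is silenced with -q so that time's "real" line really is the first line awk sees):
(/usr/bin/time -p wget -q -O floppy47.fs http://ftp.bit.nl/pub/OpenBSD/4.7/i386/floppy47.fs 2>&1; ls -l floppy47.fs) | \
awk '
NR==1 {t=$2};                                   # "real <seconds>" reported by time -p
NR==4 {printf("rate=%f bytes/second\n", $5/t)}  # file size (field 5 of ls -l) divided by elapsed time
'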
Upvotes: 4
Reputation: 14810
This worked for me, using your wget -O /dev/null <resource>. The regex I used was \([0-9.]\+ [KM]B/s\), but note I had to redirect stderr onto stdout, so the command was:
wget -O /dev/null http://example.com/index.html 2>&1 | grep '\([0-9.]\+ [KM]B/s\)'
This allows things like 923 KB/s and 1.4 MB/s. grep just finds the matching lines; to extract the value(s) you can use sed instead:
wget -O /dev/null http://example.com/index.html 2>&1 |
sed -n -e 's|^.*(\([0-9.]\+ [KM]B/s\)).*$|\1|p'
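If several files are downloaded and only the overall figure is wanted, one sketch (assuming the summary line always starts with Downloaded:, as in the transcript above, and noting it only appears when more than one file is fetched; the URLs are placeholders) is to anchor the pattern on that line:
wget -O /dev/null http://example.com/a.fs http://example.com/b.fs 2>&1 |
sed -n -e 's|^Downloaded:.*(\([0-9.]\+ [KM]B/s\)).*$|\1|p'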
Upvotes: 2
Reputation: 342659
Here's a suggestion: you can make use of wget's --limit-rate=amount option. For example, --limit-rate=400k will limit the retrieval rate to 400 KB/s. Then it's easier for you to calculate the total speed, and it saves you time and mental anguish trying to regex it.
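For example (the URL is a placeholder; 400k caps the transfer at 400 KB/s):
wget --limit-rate=400k -O /dev/null http://example.com/largefile.iso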
Upvotes: -3
Reputation: 46965
Why can't you just do this:
perl -ne "/^Downloaded.*?\((.*?)\)/; print $1"
Upvotes: -1