Jothsna Nalla

Reputation: 189

calculating the average download time

Can you help me understand the following? Thanks in advance! :)

Given T = F/C (1)

where T is the average download time, F is the file size and C is the average service capacity.

The average capacity that the downloading peer expects from the network is

(100 + 150)/2 = 125 kbps.

If the file size F is 1MB, we predict that the average download time is 64 seconds from (1).
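The 64 seconds follows from equation (1) once the units are made consistent. A minimal check, assuming "kbps" means kilobits per second and the 1 MB file size is converted with decimal units (1 MB = 8,000,000 bits = 8000 kilobits):

```python
# Worked check of equation (1): T = F / C.
# Assumption: kbps = kilobits per second; 1 MB = 8000 kilobits (decimal units).

file_size_kilobits = 1 * 8000            # F: 1 MB expressed in kilobits
capacity_kbps = (100 + 150) / 2          # C: average service capacity = 125 kbps

download_time_seconds = file_size_kilobits / capacity_kbps
print(download_time_seconds)  # 64.0
```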

Please explain this.

Upvotes: 0

Views: 6680

Answers (1)

Matt McDonald

Reputation: 5050

Well, if the server has bandwidth ("capacity") of 100 KB/s (kilobytes per second, not kilobits) and the file is 1 MB (megabytes, not megabits), then the download time would be 1024 KB (1 MB = 1024 KB) / 100 KB/s, so the file would take 10.24 seconds to download.

So T (time) = filesize (F) / available bandwidth (C)

If you wanted to know capacity rather than time, you could rearrange the formula as C = F/T, which would tell you the capacity available for future download requests.
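The rearrangement can be sketched with the same numbers as above: a 1024 KB file observed to finish in 10.24 seconds implies a capacity of 100 KB/s.

```python
# Rearranging T = F/C as C = F/T to estimate capacity from an observed download.
file_size_kb = 1024      # F, in kilobytes
observed_time_s = 10.24  # T, in seconds

capacity_kb_per_s = file_size_kb / observed_time_s
print(round(capacity_kb_per_s, 6))  # 100.0
```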

This formula can only give an estimate, as real-world times depend on how the server is operating at the time.

However, this only accounts for server bandwidth, not user bandwidth.

If you really want an accurate average download speed, you should account for both. There is only a point to this for large downloads, though; otherwise you are just wasting time calculating the speed.

But, to make it more accurate, run a test download against the user to measure their average download speed. Then compare that with your server's average download speed (a static figure, or better yet one calculated from actual downloads), take whichever is slower, and use that as your capacity figure.
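The suggestion above amounts to using the slower of the two links as C in T = F/C. A minimal sketch, where the user's speed is assumed to come from a hypothetical test download rather than a real measurement:

```python
# Estimate download time using the bottleneck of server and user speeds.
# The user speed here is a hypothetical measured value, not a real API call.

def estimate_download_time(file_size_kb, server_avg_kb_per_s, user_avg_kb_per_s):
    # The effective capacity C is limited by whichever side is slower.
    capacity = min(server_avg_kb_per_s, user_avg_kb_per_s)
    return file_size_kb / capacity

# e.g. the server averages 100 KB/s but the user's line only manages 60 KB/s:
print(estimate_download_time(1024, 100, 60))  # ~17.07 seconds
```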

Upvotes: 4
