Reputation: 12279
I think latency refers to execution "speed" when bounded by some time constant (this function cannot take more than X milliseconds to finish execution), but I don't really understand the difference between the two. Doesn't a faster function have a lower latency? Doesn't lowering the latency increase its speed? Don't those concepts imply each other?
I have tried reading definitions of both concepts but haven't really gotten it yet, so, in order to understand better the difference between the two, could you provide a real-world problem where (and why):
Also, I have the feeling that both concepts are used with slightly different meanings in the world of networking and traditional "execution speed" (in high-frequency trading for example). Is that right?
Upvotes: 0
Views: 336
Reputation: 207475
I understand "latency" to mean "how long before a system starts delivering", whereas I understand "speed" to mean throughput per second. Sometimes you can't improve latency: it takes an elephant 18 months to produce a baby elephant, and adding more mother elephants will let you make more baby elephants in 18 months, but the first one will still take 18 months.
I guess another way of thinking about it is in terms of the units - or "dimensional analysis". I would expect latency to be measured in seconds or milliseconds, whereas I would expect speed to be measured in items/second.
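The elephant analogy can be turned into a toy model (this is just an illustrative sketch, not real measurements; all names and numbers are made up): latency is the time until the first item arrives and is unaffected by parallelism, while throughput scales with the number of parallel "workers".

```python
GESTATION = 18.0  # time to produce one item (months) -- the latency

def latency(workers: int) -> float:
    # Time until the FIRST item is delivered.
    # Adding workers does not change this.
    return GESTATION

def throughput(workers: int) -> float:
    # Steady-state items delivered per month once all
    # workers are producing in parallel.
    return workers / GESTATION

# One mother elephant vs. ten: same latency, 10x the throughput.
print(latency(1), throughput(1))    # 18.0, ~0.056 items/month
print(latency(10), throughput(10))  # 18.0, ~0.556 items/month
```

Note the units, matching the dimensional-analysis point below: `latency` comes out in months (time), `throughput` in items per month (rate), so they can't be interchangeable even though improving one often improves the other.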
Upvotes: 1