Reputation: 4217
Average network timings I measured in Chrome while emulating Chrome's regular 3G network throttling:
gzip
Non-compressed
I calculated "Data Receive Time" as the difference between Time and Latency, based on their definitions: Time is the total duration from the start of the request to the receipt of the final byte of the response, and Latency is the time to load the first byte of the response.
A few things are not clear to me:
I assumed that since the client receives compressed data, it has to decompress it and then render it, so this should take more time.
Without compression, the browser just has to receive the data and render it.
So with compression there is an extra decompression step, and yet the total time is lower. Can anyone explain this?
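To put a number on the decompression step, here is a minimal sketch using Python's gzip module. The payload is a hypothetical stand-in for the page in question (repetitive markup, which compresses well, as most HTML does); the point is only that decompressing even a few hundred kilobytes takes on the order of milliseconds of CPU time.

```python
import gzip
import time

# Hypothetical payload standing in for the page in question:
# repetitive HTML compresses well, like most markup.
payload = b"<div class='row'>hello compression</div>\n" * 5000

compressed = gzip.compress(payload)

# Time only the decompression step the client would perform.
start = time.perf_counter()
restored = gzip.decompress(compressed)
elapsed_ms = (time.perf_counter() - start) * 1000

print(f"original:   {len(payload)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"decompress: {elapsed_ms:.2f} ms")
```

On a throttled 3G connection, where transferring the extra uncompressed bytes costs whole seconds, that millisecond-scale decompression cost is negligible.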
Upvotes: 0
Views: 348
Reputation: 635
I'm not sure I understand the quandary. Latency in this discussion is essentially response time: the client asked for some data, and x amount of time transpired until the first byte of the response was received. It can therefore be thought of as server processing time plus network latency.
Data receive time can be thought of as network latency distributed across every packet of the response. The compressed data requires fewer network packets to transfer, which not only reduces transfer time through sheer size reduction, it also reduces the per-packet latency cost: fewer packets transferred means less impact from latency overall.
So what is surprising here? Less data takes less time to transfer. The cost of decompressing that data in terms of CPU cycles is vastly lower than the amount of latency in just about any connection.
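A back-of-envelope model illustrates the point. All the numbers below are assumptions for illustration: the link figures roughly match Chrome's "Regular 3G" throttling preset (about 750 kbit/s down, ~100 ms latency), the payload sizes are hypothetical, and the decompression cost is a generous estimate rather than a measurement.

```python
# Assumed link parameters, roughly Chrome's "Regular 3G" preset.
THROUGHPUT_BPS = 750_000 / 8   # ~750 kbit/s down, in bytes per second
LATENCY_S = 0.100              # ~100 ms time-to-first-byte

def transfer_time_s(size_bytes: int) -> float:
    """Latency plus time to stream the payload at link throughput."""
    return LATENCY_S + size_bytes / THROUGHPUT_BPS

uncompressed = 200_000    # hypothetical 200 kB HTML page
compressed = 40_000       # ~5:1 ratio, plausible for text
decompress_cost = 0.002   # a couple of milliseconds of CPU

t_plain = transfer_time_s(uncompressed)
t_gzip = transfer_time_s(compressed) + decompress_cost

print(f"uncompressed: {t_plain:.2f} s")
print(f"gzip:         {t_gzip:.2f} s")
```

Even after paying the decompression cost, the compressed transfer wins by well over a second in this sketch, which matches the behavior observed in the question.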
Upvotes: 1