Reputation: 6279
I seem to be getting different results when testing whether content is served gzipped via Apache's mod_deflate.
I request the same URL from two PCs with almost identical setups (same version of Windows, same browser versions, etc.). On one PC the response headers include Content-Encoding: gzip; on the other they do not, and the page loads more slowly and is larger. The request header Accept-Encoding: gzip, deflate, lzma, sdch, br
is present in both cases.
Upvotes: 3
Views: 365
Reputation: 9818
Can someone recommend a reliable way of testing that content is gzipped? By reliable I mean something other than Chrome Dev Tools.
[Answer] - A reliable way to test would be to find out the size of the content stored on the server and compare it with the Content-Length received. As per this post (content-length when using http compression), the Content-Length will be the size of the compressed content.
If the content is text, the compression ratio will be 3:1 or more. If you are sending an already-compressed image, it will be a lot less. In any case, if compression is working, the ratio will be greater than 1:1.
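To make those ratio figures concrete, here is a minimal Python sketch (standard library only) showing that typical text compresses well past 3:1 under gzip, while incompressible data, simulated here with random bytes, stays at roughly 1:1:

```python
import gzip
import os

def compression_ratio(data: bytes) -> float:
    """Uncompressed size divided by gzip-compressed size."""
    return len(data) / len(gzip.compress(data))

# Repetitive text, typical of HTML/CSS/JS, compresses far beyond 3:1.
text = b"The quick brown fox jumps over the lazy dog. " * 200
text_ratio = compression_ratio(text)

# Random bytes stand in for already-compressed content (e.g. a JPEG):
# gzip cannot shrink them, so the ratio hovers at or just below 1:1.
blob = os.urandom(9000)
blob_ratio = compression_ratio(blob)

print(f"text: {text_ratio:.1f}:1, binary blob: {blob_ratio:.2f}:1")
```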
For the test setup, you can host a set of test data (text files, images) of known sizes on Apache; the file name could encode the size, e.g. 1024bytes.txt. On the client side, send a request to retrieve the data and compare the response's Content-Length header against the known size. You can automate this using tools such as mocha and chai.
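That automated check could be sketched in Python as follows (standard library only; the host name, path, and the `NNNbytes.txt` naming scheme are assumptions to adjust for your own setup):

```python
import re
import urllib.request

def expected_size_from_name(filename: str) -> int:
    """Parse a size-encoding file name such as '1024bytes.txt' into 1024."""
    m = re.match(r"(\d+)bytes\.", filename)
    if m is None:
        raise ValueError(f"cannot parse a size out of {filename!r}")
    return int(m.group(1))

def looks_compressed(known_size: int, wire_bytes: int) -> bool:
    """Compression worked if fewer bytes crossed the wire than the file holds."""
    return wire_bytes < known_size

def check_file(base_url: str, filename: str) -> bool:
    """Fetch one test file with gzip allowed and compare Content-Length."""
    req = urllib.request.Request(
        f"{base_url}/{filename}",
        headers={"Accept-Encoding": "gzip"},
    )
    with urllib.request.urlopen(req) as resp:
        sent = int(resp.headers["Content-Length"])
    return looks_compressed(expected_size_from_name(filename), sent)

# Example (needs a real server):
# check_file("http://localhost/testdata", "1024bytes.txt")
```

One caveat: mod_deflate often streams responses with Transfer-Encoding: chunked, in which case no Content-Length header is sent at all; the sketch assumes a Content-Length is present.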
Why is it that some users may receive the full content even though they request a gzipped one? Is it server-related or client-related? How can I ensure that 100% of my users get gzipped content?
[Answer] - You can only ensure this from the server side; there will always be client-side limitations and bugs. For example: https://support.microsoft.com/en-us/help/871205/internet-explorer-may-not-decompress-http-content-when-you-visit-a-web-site. As you can see, a Windows bug can cause decompression to fail. This or a similar bug probably explains why you see different behavior on a different machine.
How do we fix this?
You can compare your test setup against a known-good source. In your case, you did verify it with www.bing.com. Once you have confirmed that your test setup and client work correctly against a reliable source, test against your Apache server and certify it.
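One quick way to run that comparison is to check the Content-Encoding response header directly. A small Python sketch (standard library only; the URLs are just examples):

```python
import urllib.request

def served_gzipped(headers) -> bool:
    """True if the response declares gzip content encoding."""
    return headers.get("Content-Encoding", "").lower() == "gzip"

def probe(url: str) -> bool:
    """Request a URL advertising gzip support and report what came back."""
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        return served_gzipped(resp.headers)

# First against a known-good source, then against your own server, e.g.:
# probe("https://www.bing.com")   # the reliable reference
# probe("http://localhost/")      # your Apache setup
```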
Upvotes: 2
Reputation: 9343
Assuming the issue is a caching proxy, as discussed in the comments, there are a couple of options.
Apache can add HTTP headers that tell caches not to store the content. Put the following into your VirtualHost block (it requires mod_headers to be enabled):
Header set Cache-Control "private"
No promises that the above will work: Cache-Control is advisory and may be ignored by a proxy.
A more reliable way would be to use HTTPS. This prevents almost all proxies from reading anything other than the host name, so it is impossible for them to act as a cache. SSL certificates are pretty cheap, but you could test for free using a self-signed certificate first.
There are instructions for creating a self-signed cert here. Then copy your VirtualHost block, paste a new one, change the port to 443, and add the following:
<VirtualHost *:443>
.... Existing config
SSLEngine on
SSLCertificateFile /path/to/your_domain_name.crt
SSLCertificateKeyFile /path/to/your_private.key
</VirtualHost>
As a final point, I would note that if an organisation has implemented a caching proxy, its purpose is probably to improve performance for its users. The best way to know for sure whether a technical change improves speed for end users is to measure it. The Chrome developer tools include a network monitor with page-load timings and lots of other interesting details.
Upvotes: 1