Reputation: 1590
Using ASP.NET 2.0 and IIS 6.
I enabled compression by implementing IHttpModule and registering the module in Web.config:
using System;
using System.IO.Compression;
using System.Web;

public class EnableCompression : IHttpModule
{
    public void Init(HttpApplication application)
    {
        application.BeginRequest += new EventHandler(this.Application_BeginRequest);
    }

    private void Application_BeginRequest(Object source, EventArgs e)
    {
        HttpContext context = HttpContext.Current;
        String encoding = context.Request.Headers.Get("Accept-Encoding");
        if (encoding == null)
            return;

        // ToLower() returns a new string; it does not modify in place.
        encoding = encoding.ToLower();

        if (encoding.Contains("gzip"))
        {
            // Wrap the response stream in a gzip filter and advertise it.
            context.Response.Filter = new GZipStream(context.Response.Filter, CompressionMode.Compress);
            context.Response.AppendHeader("Content-Encoding", "gzip");
        }
        else if (encoding.Contains("deflate"))
        {
            context.Response.Filter = new DeflateStream(context.Response.Filter, CompressionMode.Compress);
            context.Response.AppendHeader("Content-Encoding", "deflate");
        }
    }

    void IHttpModule.Dispose()
    {
        // No resources to release.
    }
}
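For reference, here is how the module is registered in Web.config (a minimal sketch; the bare type name assumes the class lives in App_Code, otherwise use "Namespace.EnableCompression, AssemblyName"):

<configuration>
  <system.web>
    <httpModules>
      <add name="EnableCompression" type="EnableCompression"/>
    </httpModules>
  </system.web>
</configuration>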
The above works just fine until I also set an expires header in the Page_Load event in my master page's code-behind file:
protected void Page_Load(object sender, EventArgs e)
{
    // Cache the response on the server and in the browser's private cache,
    // expiring ten minutes from now.
    Response.Cache.SetCacheability(HttpCacheability.ServerAndPrivate);
    Response.Cache.SetExpires(DateTime.Now.AddMinutes(10));
    Response.Cache.SetValidUntilExpires(true);
    Response.Cache.SetLastModified(DateTime.Now);
}
After setting the expires header, I get a bunch of garbage once the browser retrieves the page from cache. So the first time a page is loaded, it's fine. If I go to a new page and then go back to the first page, I only get a page full of:
��`I�%&/m�{J�J��t��`$ؐ@�����iG#)�*�
If I disable compression but leave the expires header, it's fine. If I disable the expires header but leave compression enabled, it's fine. If I enable both, I get a page full of garbage.
I have no idea what's going on.
Response Header
HTTP/1.1 200 OK
Date: Sun, 08 Jan 2012 07:23:15 GMT
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
X-AspNet-Version: 2.0.50727
Content-Encoding: gzip
Cache-Control: private
Expires: Sun, 08 Jan 2012 07:33:00 GMT
Last-Modified: Sun, 08 Jan 2012 07:23:00 GMT
Vary: *
Content-Type: text/html; charset=utf-8
Content-Length: 13613
Request Header
GET /eng/DE/ HTTP/1.1
Host: wwwnpg
User-Agent: Mozilla/5.0 (Windows NT 5.1; rv:9.0.1) Gecko/20100101 Firefox/9.0.1
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip, deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Connection: keep-alive
Cookie: ASP.NET_SessionId=gxdxwnnum5rumue13rxmc5mb
Upvotes: 1
Views: 1367
Reputation: 18654
You should check your HTTP headers with a web proxy like Fiddler to be sure, but I suspect what's happening is that the runtime is caching your pre-filtered content, as well as the headers you set. So the browser ends up thinking that the content is compressed, when it actually isn't.
The solution is to set your compression Filter later in the page lifecycle, such as in the PostRequestHandlerExecute event in your HttpModule. I use that event to set a whitespace Filter, and it works fine for me with the output cache (with IIS7+).
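As a rough sketch, moving the hookup out of BeginRequest might look like this (gzip-only for brevity; the handler name is illustrative):

public void Init(HttpApplication application)
{
    // Attach the filter after the page handler has executed, rather than
    // at BeginRequest, so it no longer interacts badly with the output cache.
    application.PostRequestHandlerExecute += new EventHandler(this.Application_PostRequestHandlerExecute);
}

private void Application_PostRequestHandlerExecute(Object source, EventArgs e)
{
    HttpContext context = HttpContext.Current;
    String encoding = context.Request.Headers.Get("Accept-Encoding");
    if (encoding == null || !encoding.ToLower().Contains("gzip"))
        return;

    context.Response.Filter = new GZipStream(context.Response.Filter, CompressionMode.Compress);
    context.Response.AppendHeader("Content-Encoding", "gzip");
}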
Another reason to set the Filter late in the lifecycle is to handle HTTP error pages. Otherwise, if your page throws an Exception, the Filter might be set, but the runtime will discard any custom headers.
You should also vary the output cache based on the Accept-Encoding header, by setting VaryByHeader in the OutputCache directive:
<%@ OutputCache Duration="60" VaryByParam="None"
VaryByHeader="Accept-Encoding" %>
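Since you're setting the cache policy programmatically in Page_Load rather than with an OutputCache directive, the equivalent in code is:

Response.Cache.VaryByHeaders["Accept-Encoding"] = true;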
Upvotes: 4