Krishna Deepak

Reputation: 1765

Do Wikipedia servers gzip content?

Does the Wikimedia API support gzip encoding? I'm using cURL to get the content and have asked for gzip encoding, but it does not seem to work:

curl_setopt($ch,CURLOPT_ENCODING , "gzip");
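For context, here is a stripped-down sketch of the kind of request I'm making (the URL, query parameters, and User-Agent here are just placeholders):

<?php
// Stripped-down sketch; URL and parameters are placeholders.
$ch = curl_init('https://en.wikipedia.org/w/api.php?action=query&titles=PHP&format=json');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);       // return the body instead of printing it
curl_setopt($ch, CURLOPT_ENCODING, "gzip");           // send Accept-Encoding: gzip and auto-decompress
curl_setopt($ch, CURLOPT_USERAGENT, 'my-script/1.0'); // Wikimedia asks for a descriptive User-Agent
$response = curl_exec($ch);
curl_close($ch);
var_dump(strlen($response));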

Upvotes: 0

Views: 187

Answers (3)

Ilmari Karonen

Reputation: 50338

To answer your literal question: yes, it does.

One way to test this is to install Firebug and visit a MediaWiki API URL with the "Net" tab active. The response headers you'll see should look something like this:

HTTP/1.0 200 OK
Date: Mon, 07 May 2012 23:05:37 GMT
Server: Apache
X-Content-Type-Options: nosniff
Cache-Control: private
MediaWiki-API-Error: help
Content-Encoding: gzip
Vary: Accept-Encoding
Content-Length: 37421
Content-Type: text/html; charset=utf-8
X-Cache: MISS from sq59.wikimedia.org, MISS from amssq35.esams.wikimedia.org, MISS from amssq44.esams.wikimedia.org
X-Cache-Lookup: MISS from sq59.wikimedia.org:3128, MISS from amssq35.esams.wikimedia.org:3128, MISS from amssq44.esams.wikimedia.org:80
Connection: keep-alive

(And no, it's not just because of Wikipedia's front-end proxies, either; I just tried the same on my own wiki, which is not behind a proxy, and got the same Content-Encoding header.)
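If you'd rather check from PHP than from Firebug, a quick sketch like the following (the URL and User-Agent are just examples) prints the raw response headers, where you should see Content-Encoding: gzip:

<?php
// Sketch: fetch an API page and print only the raw response headers.
$ch = curl_init('https://en.wikipedia.org/w/api.php?action=help');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true);               // prepend headers to the returned string
curl_setopt($ch, CURLOPT_ENCODING, "gzip");           // advertise gzip so the server will use it
curl_setopt($ch, CURLOPT_USERAGENT, 'gzip-test/1.0'); // example User-Agent
$response = curl_exec($ch);
$headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
curl_close($ch);
echo substr($response, 0, $headerSize);               // look for "Content-Encoding: gzip" here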

Upvotes: 0

Alexey

Reputation: 3484

Have you tried adding 'Accept-Encoding: gzip, deflate' to the request headers? I tried it with this sample:

http://pastebin.com/AmndVB3e

and it returned gzipped content.
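In rough outline, that approach looks something like this (a sketch; the actual pastebin code may differ). Note that when you set Accept-Encoding by hand instead of through CURLOPT_ENCODING, cURL hands back the compressed bytes untouched, so you have to decompress them yourself:

<?php
// Sketch: request gzip via an explicit header; cURL will not auto-decompress here.
$ch = curl_init('https://en.wikipedia.org/w/api.php?action=query&titles=PHP&format=json');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Accept-Encoding: gzip, deflate'));
curl_setopt($ch, CURLOPT_USERAGENT, 'gzip-test/1.0'); // example User-Agent
$raw = curl_exec($ch);
curl_close($ch);

$json = gzdecode($raw); // gzdecode() needs PHP 5.4+; assumes the server chose gzip, not deflate
echo $json;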

Upvotes: 1

Tarang

Reputation: 75945

Try this (let me know if it works; I can't tell from the rest of your cURL options):

curl_setopt($ch, CURLOPT_ENCODING, "deflate, gzip, identity");
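(Passing an empty string instead, curl_setopt($ch, CURLOPT_ENCODING, "");, makes cURL advertise every encoding it supports and decompress the response automatically.)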

Upvotes: 0
