trinity

Reputation: 10484

writing an HTTP sniffer

I would like to write a program to extract the URLs of websites visited by a system (an IP address) through packet capture. I think the URL will appear in the data section of the packet, i.e. not in any of the headers (Ethernet / IP / TCP-UDP). Such programs are sometimes referred to as HTTP sniffers; I'm not supposed to use any existing tool. As a beginner, I've just gone through this basic sniffer program: sniffex.c. Can anyone tell me in which direction I should proceed?
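For context, a minimal libpcap skeleton along the lines of sniffex.c is roughly where I am starting from (the "eth0" interface name and the "tcp port 80" filter below are just placeholders):

#include <pcap.h>
#include <stdio.h>

/* Called once per captured packet. bytes points at the Ethernet header;
 * the HTTP data (where the URL should be) starts after the
 * Ethernet + IP + TCP headers, as sniffex.c demonstrates. */
static void handler(u_char *user, const struct pcap_pkthdr *h, const u_char *bytes)
{
    printf("captured %u bytes\n", h->caplen);
}

int main(void)
{
    char errbuf[PCAP_ERRBUF_SIZE];
    struct bpf_program fp;

    pcap_t *p = pcap_open_live("eth0", 65535, 1, 1000, errbuf);  /* interface is a placeholder */
    if (!p) { fprintf(stderr, "pcap_open_live: %s\n", errbuf); return 1; }

    /* Restrict capture to web traffic; plain HTTP normally uses TCP port 80. */
    if (pcap_compile(p, &fp, "tcp port 80", 1, PCAP_NETMASK_UNKNOWN) == -1 ||
        pcap_setfilter(p, &fp) == -1) {
        fprintf(stderr, "filter: %s\n", pcap_geterr(p));
        return 1;
    }

    pcap_loop(p, -1, handler, NULL);  /* run until interrupted */
    pcap_close(p);
    return 0;
}

What I don't know is how to get from that callback to the actual URL.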

Upvotes: 2

Views: 4452

Answers (6)

bobbypavan

Reputation: 548

I was researching something similar and came across this. It could be a good starting point if you are using Linux: justniffer.

http://justniffer.sourceforge.net/

There is also a nice Python script for grabbing HTTP traffic that would help if you are looking to pull information out of HTTP requests.

Upvotes: 0

symcbean

Reputation: 48357

Have a look at PasTmon. http://pastmon.sourceforge.net

Upvotes: 0

aehiilrs

Reputation: 1245

Note: in everything below, "GET" also stands in for POST and the other HTTP methods.

It's definitely going to be a lot more work than looking at a single packet, but if you capture the entire stream you should be able to extract the URL from the HTTP headers that are sent out.

Look at the Host header if it's provided, and also at what is actually requested in the GET line. The request target in the GET can be either a full URL or just a path on the server.

Also note that this has nothing to do with resolving an IP address back to a domain name. If you want the domain name, you have to dig it out of the data (the Host header), not out of a reverse lookup.

Quick example on my machine, from Wireshark:

GET http://www.google.ca HTTP/1.1
Host: www.google.ca
{other headers follow}

Another example, not from a browser, and with only a path in the GET:

GET /ccnet/XmlStatusReport.aspx HTTP/1.1
Host: example.com

In the second example, the actual URL is http://example.com/ccnet/XmlStatusReport.aspx
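If you are working from a sniffex.c-style capture callback in C, a rough sketch of the parsing could look like the hypothetical print_url helper below. It assumes the request line and Host header arrive in a single packet, which is usually but not always the case:

#include <stdio.h>
#include <string.h>

/* Sketch only: payload/len are the TCP payload of one captured packet. */
static void print_url(const char *payload, size_t len)
{
    char buf[4096], target[2048] = "", host[1024] = "";

    if (len >= sizeof(buf))
        len = sizeof(buf) - 1;
    memcpy(buf, payload, len);
    buf[len] = '\0';                           /* make it a C string */

    /* Request line: "GET <target> HTTP/1.1" (substitute the other methods as needed). */
    if (sscanf(buf, "GET %2047s", target) != 1)
        return;                                /* not a GET request */

    /* Host header, if present, on one of the following lines. */
    const char *h = strstr(buf, "\r\nHost:");
    if (h)
        sscanf(h + 7, " %1023s", host);

    if (strncmp(target, "http://", 7) == 0)
        printf("%s\n", target);                /* absolute form, as in the first example */
    else if (host[0] != '\0')
        printf("http://%s%s\n", host, target); /* Host + path, as in the second example */
}

A real sniffer would need to reassemble the TCP stream first, since the request line and headers can be split across packets.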

Upvotes: 4

Paul Tomblin

Reputation: 182782

No, an IP address alone is not enough information. A single IP can correspond to any number of domain names, and each of those domains can serve a practically unlimited number of URLs.

However, look at gethostbyaddr(3) to see how to do a reverse DNS lookup on the IP and at least get the canonical name for that address.
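Something along these lines (the address here is only an example):

#include <stdio.h>
#include <netdb.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

int main(void)
{
    struct in_addr addr;
    inet_pton(AF_INET, "8.8.8.8", &addr);            /* example address only */

    /* Reverse lookup: IP -> canonical host name. */
    struct hostent *he = gethostbyaddr(&addr, sizeof(addr), AF_INET);
    if (he != NULL)
        printf("canonical name: %s\n", he->h_name);
    else
        herror("gethostbyaddr");
    return 0;
}

Keep in mind the name you get back is often a generic hosting name rather than the site the user actually visited, for the virtual-hosting reason above.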

Update: as you've edited the question, @aehiilrs has a much better answer.

Upvotes: 4

Boolean

Reputation: 14664

If you are using Linux, you can add an iptables rule that looks for packets containing HTTP GET requests and pulls the URL out of them.

So the rule would look something like this:

For each packet leaving localhost on port 80 -> check whether the packet contains a GET request -> retrieve the URL and save it

Note that this only works for plain HTTP: with HTTPS the request, including the URL, is encrypted, so it cannot be read from the packets.

Upvotes: 0

bmargulies

Reputation: 100042

What you might want is a reverse DNS lookup. Call gethostbyaddr for that.

Upvotes: 0
