Frank

Reputation: 2173

Command to retrieve all information about a website

Is there a Unix command to retrieve all possible information about a website?

I mean info like: IP, IP geolocation, (sub)domains, alternative domain names, name servers, and any other information of that kind.

I know about whois, but is there anything else? Something that gives more information?

Thanks

Upvotes: 0

Views: 537

Answers (1)

weletonne

Reputation: 489

I don't know of any command that can do all of that at once, but a few simple commands chained together work just as well (see the combined sketch after the list).

  1. ping www.website.com for the IP
  2. curl ipinfo.io/<ip-address> for the geo-location of that IP
  3. nslookup -query=soa www.website.com for the SOA record, which names the primary name server
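A minimal sketch that glues the three steps into one script (siteinfo.sh is a made-up name, and the sed parsing assumes the usual "PING host (ip) ..." banner line that ping prints):

    #!/bin/sh
    # Usage: ./siteinfo.sh www.website.com
    HOST="$1"

    # 1. Resolve the IP with a single ping and pull the address
    #    out of the "PING host (ip)" banner line
    IP=$(ping -c 1 "$HOST" | sed -n 's/^PING [^ ]* (\([^)]*\)).*/\1/p')
    echo "IP: $IP"

    # 2. Geo-location for that IP (ipinfo.io returns JSON for a bare IP path)
    curl -s "https://ipinfo.io/$IP"
    echo

    # 3. SOA record, which names the primary name server for the zone
    nslookup -query=soa "$HOST"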

Alternatively, you can use the dig command to find the subdomains via DNS (a concrete example follows the list):

  1. dig domain.com: the AUTHORITY section of the output lists the name servers in use (or query them directly with dig NS domain.com)
  2. dig @dns.server domain.com AXFR to request a zone transfer, which lists the records, and therefore the subdomains, of domain.com
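For example, against the placeholder domain.com with a hypothetical name server ns1.domain.com, the two calls would look like this. Be aware that most servers refuse zone transfers from arbitrary clients, so the second step often just answers "Transfer failed.":

    # 1. Find the authoritative name servers for the zone
    dig NS domain.com

    # 2. Ask one of them for a full zone transfer (AXFR) to enumerate
    #    the subdomains; commonly refused for unauthorized clients
    dig @ns1.domain.com domain.com AXFR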

Upvotes: 2
