aDoN

Reputation: 1951

Get all urls indexed to google from a website

I want a program that does this: given a website, list all the URLs Google has indexed for it, with clean output (one URL per line), and also find the indexed URLs that are no longer linked on the website (a spider can already find the linked ones).

I have been searching and only finding sloppy options. What I want is accurate and simple: INPUT: a URL. OUTPUT: all the indexed URLs.

Upvotes: -1

Views: 1732

Answers (2)

user1170117

Reputation: 153

This is old, but it shows up as the number one result on Google. If you need to find all the URLs a site has in Google, just search for

site:domain.com

Upvotes: 1

Oleg

Reputation: 823

I don't know of an application that does all of this, but I'll try to simplify your task by dividing it:

  1. You need a list of your website's internal links. Any web crawler tool can do that.
  2. You need a list of your website's pages indexed by Google. There are a lot of search-engine index checkers; you can google for one.
  3. Compare the 2nd list to the 1st one, and find all the links present in Google's index but missing from your website.

Upvotes: 1
