Zach

Reputation: 161

How to get certain pages to not be indexed by search engines?

I did:

<meta name="robots" content="none"/>

Is that the best way to go about it, or is there a better way?

Upvotes: 0

Views: 72

Answers (2)

princeconcord

Reputation: 126

You can create a file called robots.txt in the root of your site. The format is this:

User-agent: (user agent string to match)
Disallow: (URL here)
Disallow: (other URL here)
...

Example:

User-agent: *
Disallow: /

This tells (well-behaved) robots not to crawl anything on your site. Of course, some robots will ignore robots.txt completely, but for the ones that honor it, it's your best bet.
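Since you asked about certain pages rather than the whole site, you can list only the paths you want blocked. For example (the paths below are just placeholders; use your own):

User-agent: *
Disallow: /private/
Disallow: /drafts/some-page.html

Anything not listed under a Disallow line remains crawlable.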

If you'd like more information on the robots.txt file, please see http://www.robotstxt.org/

Upvotes: 1

Victor

Reputation: 4721

You could use a robots.txt file to tell search engines which pages not to crawl.

http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=156449

Or use a noindex meta tag on each page you want excluded.
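For reference, the noindex meta tag goes in each page's head section:

<head>
  <meta name="robots" content="noindex">
</head>

Unlike robots.txt, this lets the engine fetch the page but asks it not to show the page in search results.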

Upvotes: 0
