RedhopIT

Reputation: 619

How to hide a website from search engines

I'm developing a website and I want to host the development work so the client can review and approve it, but I don't want the site to be found by search engines.

How can I hide it?

I've heard about adding a robots.txt file?

Any advice?

Upvotes: 7

Views: 21744

Answers (6)

Frankoholix

Reputation: 1

If you don't want web crawlers finding your public links through search engines, add this code to your .htaccess file.

# Block requests from common search engine bots and command-line clients
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (bot|crawler|spider|curl|wget) [NC]
RewriteRule .* - [F,L]

This makes the server return a 403 Forbidden response to any request whose User-Agent matches, so those crawlers can't access your public pages. It takes effect immediately.

Upvotes: 0

Noor Uz zama

Reputation: 1

You can also use a meta tag with noindex, nofollow.
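
For reference, the standard robots meta tag goes in each page's <head> section, for example:

<meta name="robots" content="noindex, nofollow">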

Upvotes: 0

jpa

Reputation: 12176

If you don't put public links to the website anywhere, search engines will not easily find it. However, if many people discuss the site, a link is bound to appear somewhere eventually.

A robots.txt file will work for all legitimate search engines, such as Google. It will not help, however, if someone stumbles on the URL or it is left in browser history.

The surest way is to put a password on the site: http://www.elated.com/articles/password-protecting-your-pages-with-htaccess/
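
As a rough sketch, HTTP Basic Auth on Apache can be set up with a few lines in .htaccess (the realm name and the path to the .htpasswd file below are placeholders; adjust them for your server):

# Require a username/password for the whole development site
AuthType Basic
AuthName "Development site"
AuthUserFile /full/path/to/.htpasswd
Require valid-user

The .htpasswd file itself can be created with the htpasswd tool, e.g. htpasswd -c /full/path/to/.htpasswd clientname.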

Upvotes: 7

Andy McCormick

Reputation: 227

I use robots.txt when I need to hide folders or sites that are in development: http://www.robotstxt.org/robotstxt.html. I think it'll work best for what you need.
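
For example, a robots.txt that keeps crawlers out of a single in-development folder could look like this (the folder name is just a placeholder):

User-agent: *
Disallow: /dev-site/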

Upvotes: 4

mrswadge

Reputation: 1749

Yes, you can use robots.txt. Check out this guide: http://www.robotstxt.org/robotstxt.html.

To exclude all robots from the entire server:

User-agent: *
Disallow: /

Upvotes: 8

woz

Reputation: 11004

You can find it on Wikipedia:

User-agent: *
Disallow: /

Put that in your robots.txt file, which goes in the top-level directory of your web server as described here.

Upvotes: 11
