Reputation: 619
I'm developing a website, and I want to host the development work so the client can review and approve it, but I don't want the site to be found by search engines.
How can I hide it?
I've heard about adding a robots.txt file.
Any advice?
Upvotes: 7
Views: 21744
Reputation: 1
If you don't want web crawlers to find your public links via search engines, add this to your .htaccess file:
# Block common search engine bots by User-Agent
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (bot|crawler|spider|curl|wget) [NC]
RewriteRule .* - [F,L]
This returns a 403 Forbidden to any client whose User-Agent matches those patterns, and it takes effect immediately. Note that it only blocks crawlers that identify themselves honestly in the User-Agent header.
Upvotes: 0
Reputation: 12176
If you don't publish links to the website, search engines will not easily find it. At some point, though, a link is bound to appear if people discuss the site.
A robots.txt file will work for all legitimate search engines, such as Google. It will not help, however, if someone stumbles on the URL or it is left in browser history.
The surest way is to put a password on the site: http://www.elated.com/articles/password-protecting-your-pages-with-htaccess/
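For reference, the linked approach boils down to HTTP Basic Auth configured in .htaccess; a minimal sketch (the AuthName text, the /home/user/.htpasswd path, and the username "client" are placeholders for your own values):

```apacheconf
# .htaccess — require a username/password for the whole directory
AuthType Basic
AuthName "Development site"
# Absolute path to the password file, created with: htpasswd -c /home/user/.htpasswd client
AuthUserFile /home/user/.htpasswd
Require valid-user
```

Keep the .htpasswd file outside the web root so it can't be downloaded.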
Upvotes: 7
Reputation: 227
I use robots.txt when I need to hide folders or sites in development: http://www.robotstxt.org/robotstxt.html I think it will work best for what you need.
Upvotes: 4
Reputation: 1749
Yes, you can use robots.txt. Check out this guide: http://www.robotstxt.org/robotstxt.html
To exclude all robots from the entire server:
User-agent: *
Disallow: /
Upvotes: 8