Reputation: 342
I have two domains on the same server, and I want to disallow search engines on one of them:
www.example.com -> points to the server root
www.example.net -> points to a subdirectory
I added a robots.txt to the .net directory to disallow everything on www.example.net, like this:
User-agent: *
Disallow: /
Am I doing this right? Will this disallow search engines only on .net?
Upvotes: 1
Views: 213
Reputation: 4561
As can be seen on this page, yes, you're doing exactly the right thing:
User-agent: *
Disallow: /
If the two websites were served from the same directory, there would be, as far as I'm aware, no way to apply a robots.txt to just one of them.
However, as the same page also states, this only works with well-behaved robots, meaning some crawlers may ignore robots.txt entirely.
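As a middle ground (this is my own suggestion, assuming Apache with mod_headers enabled), you can send an X-Robots-Tag header from an .htaccess in the .net directory; the major search engines honor it even for pages they discover through links:

```apache
# In the .net directory's .htaccess: ask compliant crawlers
# not to index anything served from here (requires mod_headers)
Header set X-Robots-Tag "noindex, nofollow"
```

Note that a crawler blocked by robots.txt never fetches the page at all, so it will never see this header; pick one mechanism or the other. And like robots.txt, this is still only a request, not enforcement.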
If you really need the .net site to be hidden from search engines, the one foolproof way is to add username/password protection in .htaccess (anyone accessing the .net site will then also need the username/password):
AuthType Basic
AuthName "Password Protected Area"
AuthUserFile /path/to/.htpasswd
Require valid-user
.htpasswd (note that Apache expects the password field to be a hash, not plain text):
[user]:[hashed-password]
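To generate such an entry without the htpasswd utility, one option is openssl's apr1 (Apache MD5) hash; "alice" and "secret" below are placeholder values, and the path should match the AuthUserFile directive above:

```shell
# Append a user with an apr1-hashed password, which Apache Basic auth accepts.
# "alice" / "secret" are placeholders -- substitute your own credentials.
printf 'alice:%s\n' "$(openssl passwd -apr1 secret)" >> /path/to/.htpasswd
```

If the htpasswd tool from apache2-utils is available, `htpasswd -c /path/to/.htpasswd alice` does the same job interactively.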
Upvotes: 1