Tim Wilson

Reputation: 169

Need to target/disallow subdomains in robots.txt

Good morning.

So, I have run into a tricky situation. My environment is a two-server mirrored setup, with two subdomains that target each server specifically when needed. I would like to disallow indexing of the two subdomains without affecting the www.

For example, I have sub1.domain.com, sub2.domain.com, and www.domain.com, and they all point to the same web root directory. Simply adding a user-agent disallow rule to robots.txt will not work, as it would also deindex the www.

Please feel free to ask any questions as needed.

Thanks!

Upvotes: 1

Views: 221

Answers (1)

Jon Lin

Reputation: 143906

You can create a second robots file and name it something like no-index-robots.txt. You'd just put:

User-agent: *
Disallow: /

in there. Then in the .htaccess file in your document root, add this:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(sub1\.|sub2\.)domain\.com$ [NC]
RewriteRule ^robots\.txt$ /no-index-robots.txt [L]
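This way, requests for /robots.txt on sub1 or sub2 are silently served the disallow-all file, while www keeps the normal robots.txt. If you want to sanity-check which hosts the RewriteCond pattern would catch before deploying, here's a minimal Python sketch that mirrors the same regex (the function name robots_path is just for illustration):

```python
import re

# Mirrors the RewriteCond pattern above: matches sub1.domain.com or
# sub2.domain.com, case-insensitively ([NC]), but not www.domain.com.
HOST_PATTERN = re.compile(r"^(sub1\.|sub2\.)domain\.com$", re.IGNORECASE)

def robots_path(host: str) -> str:
    """Return the file the rewrite rule would serve for /robots.txt on this host."""
    if HOST_PATTERN.match(host):
        return "/no-index-robots.txt"  # mirrored subdomains get the disallow-all file
    return "/robots.txt"               # www keeps its normal robots.txt

for host in ("sub1.domain.com", "SUB2.domain.com", "www.domain.com"):
    print(host, "->", robots_path(host))
```

You can also verify the live behavior by fetching http://sub1.domain.com/robots.txt and http://www.domain.com/robots.txt and comparing the responses.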

Upvotes: 2
