Reputation: 10891
I have a Drupal site that uses the Domain Access module to host multiple sites from a single Drupal installation, so there is only one .htaccess file and one robots.txt file for the whole installation.
I don't want a few of the sub-sites crawled by search engines. From what I understand, robots.txt can't be used for this since a single file can't vary per domain, so I was hoping to block the search engine bots if they try to access specific domains on my server.
I found this .htaccess snippet for blocking the bots, but how can I add logic so it runs only when a specific domain is being accessed?
# Return 403 Forbidden to the listed crawlers (case-insensitive match)
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|Baiduspider) [NC]
RewriteRule .* - [R=403,L]
Upvotes: 1
Views: 70
Reputation: 785266
You can add one more RewriteCond to this rule so it blocks only when the HTTP_HOST of the request is sub.domain.com:
# Match the bot user agents (case-insensitive)
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|Baiduspider) [NC]
# ...but only for requests to this specific domain
RewriteCond %{HTTP_HOST} ^sub\.domain\.com$ [NC]
# Return 403 Forbidden instead of serving the page
RewriteRule ^ - [F]
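If several sub-sites need to be blocked, the host condition can use a regex alternation instead of a single name. A minimal sketch, assuming hypothetical sub-sites sub1.domain.com and sub2.domain.com:

# Hypothetical example: block the same bots on two sub-sites at once
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|Baiduspider) [NC]
RewriteCond %{HTTP_HOST} ^(sub1|sub2)\.domain\.com$ [NC]
RewriteRule ^ - [F]

In Drupal's stock .htaccess, these rules would typically go inside the existing <IfModule mod_rewrite.c> block, after RewriteEngine on and before Drupal's own rewrite rules, so they run before the request is handed to index.php.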
Upvotes: 3