Lakshmikantha raju

Reputation: 59

Disable crawling for subdomain

I want to disable crawling for my subdomains.

For example: my main domain is maindomain.com
subdomain_one.com (add-on domain)
subdomain_two.com (add-on domain)

So I want to disable crawling for subdomain_one.maindomain.com.

I have used this in robot.txt:

   User-agent: *
   Disallow: /subdomain_one/
   Disallow: /subdomain_two/

Upvotes: 2

Views: 4232

Answers (1)

unor

Reputation: 96607

The file must be called robots.txt, not robot.txt.

If you want to disallow all bots from crawling your subdomain, you have to place a robots.txt file in the document root of that subdomain, with the following content:

User-agent: *
Disallow: /

Each host needs its own robots.txt. You can’t specify subdomains inside a robots.txt file, only beginnings of URL paths.

So if you want to block all files on http://sub.example.com/, the robots.txt must be accessible from http://sub.example.com/robots.txt.

It doesn’t matter how your sites are organized on the server-side, it only matters what is publicly accessible.
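You can check the effect of such a file with Python's standard-library `urllib.robotparser`. This is just a sketch: it parses the same two-line content locally rather than fetching it, and `sub.example.com` is a placeholder host:

```python
from urllib.robotparser import RobotFileParser

# Parse the "block everything" robots.txt content directly,
# so the example runs without any network access.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# With "Disallow: /", no URL on the host may be crawled.
print(rp.can_fetch("*", "http://sub.example.com/"))           # False
print(rp.can_fetch("*", "http://sub.example.com/page.html"))  # False
```

In a real check you would point `rp.set_url()` at `http://sub.example.com/robots.txt` and call `rp.read()` instead of `rp.parse()`.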

Upvotes: 7
