Reputation: 11
Can anyone guide me on how to create a robots.txt file for the following URLs or directories?
Original URLs that I want indexed by search engines:
book2park.com/
book2park.com/locations.php
But I found the following URLs (almost all of the pages) in Google's index, and I want to permanently disallow them from all search engines:
lawnchair.book2park.com/
lawnchair.book2park.com/locations.php
Basically, "lawnchair." is prepended to every URL.
Upvotes: 1
Views: 764
Reputation: 1752
A given robots.txt file applies only to the exact subdomain it was loaded from. In other words, the following robots.txt file:
http://sub1.example.com/robots.txt
can only control crawling of:
http://sub1.example.com/...
It cannot control crawling of:
http://example.com/...
http://sub2.example.com/...
http://sub.sub1.example.com/...
The solution is to serve a separate robots.txt file from each subdomain. So, at http://lawnchair.book2park.com/robots.txt you could block everything:
User-agent: *
Disallow: /
and at http://book2park.com/robots.txt you could allow everything:
User-agent: *
Disallow:
(or you could simply not have a robots.txt file on the main domain at all, which has the same effect)
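If you want to sanity-check the two files before deploying them, here is a small sketch using Python's standard-library `urllib.robotparser`. It parses each robots.txt body directly (no network access) and asks whether a generic crawler may fetch the example URLs from the question:

```python
from urllib.robotparser import RobotFileParser

# The blocking file proposed for lawnchair.book2park.com/robots.txt
blocking = RobotFileParser()
blocking.parse(["User-agent: *", "Disallow: /"])

# The allow-everything file proposed for book2park.com/robots.txt
allowing = RobotFileParser()
allowing.parse(["User-agent: *", "Disallow:"])

# The subdomain pages should be blocked for any user agent
print(blocking.can_fetch("*", "http://lawnchair.book2park.com/"))              # False
print(blocking.can_fetch("*", "http://lawnchair.book2park.com/locations.php")) # False

# The main-domain pages should remain crawlable
print(allowing.can_fetch("*", "http://book2park.com/"))              # True
print(allowing.can_fetch("*", "http://book2park.com/locations.php")) # True
```

Note that each parser here models the file served by one host, mirroring the point above: the rules only ever apply to the subdomain the file was loaded from.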
Upvotes: 2