Reputation: 661
I have a site at, say, www.example.com, and two staging platforms at beta.example.com and preview.example.com. I need a way to serve a different robots.txt for each, using IIS or something similar.
The reason is that I want to disallow spiders on all but the www domain, because they are indexing duplicate content from the staging sites.
Does anyone know if this is possible?
Upvotes: 1
Views: 798
Reputation: 4392
Is there a reason you can't just have three different robots.txt files, one for each host?
If you need to handle this automatically, I would suggest an HttpHandler
that handles requests for robots.txt. If Request.Url.Host
is www.example.com, return robots.allow.txt; otherwise, return robots.deny.txt.
If you are interested in this idea, something like the sketch below should get you started.
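A minimal sketch, assuming an ASP.NET site running under IIS; the class name RobotsHandler is a placeholder, and robots.allow.txt / robots.deny.txt are assumed to sit in the site root:

```csharp
using System;
using System.Web;

// Hypothetical handler name; serves one of two pre-made robots files
// depending on which host the request came in on.
public class RobotsHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        // Only the production host gets the permissive file; beta, preview,
        // and anything else get the deny-all file.
        bool isLive = string.Equals(
            context.Request.Url.Host,
            "www.example.com",
            StringComparison.OrdinalIgnoreCase);

        string virtualPath = isLive ? "~/robots.allow.txt" : "~/robots.deny.txt";

        context.Response.ContentType = "text/plain";
        context.Response.WriteFile(context.Server.MapPath(virtualPath));
    }
}
```

You would then map the robots.txt path to this handler in web.config (under system.webServer/handlers for IIS 7+ integrated mode, or httpHandlers for classic mode) so IIS passes the request to managed code instead of serving it as a static file.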
Upvotes: 1