Reputation: 1
This is my first question here, although I have been using this forum for a long time to solve many of my problems. This time, however, I can't find any information that helps with my issue.
I have several cobrands with affiliate content. They are white labels, so I can't control their content at all; I only set the CNAME for them. The problem is that Google now seems to treat the cobrands as part of the root domain.
I need to deindex these cobrands to get rid of the low-content report. I am trying to contact the affiliate managers, but some of them are very slow.
I am looking for a way to disallow these sites in robots.txt: http://cobrand.domain.com ...
Any help is much appreciated. Thanks.
Upvotes: 0
Views: 66
Reputation: 96577
If you want to disallow crawling of URLs whose host is http://cobrand.example.com/, you need to place a text file named robots.txt at the document root of that host: http://cobrand.example.com/robots.txt
The following content would block everything for every polite bot:
User-agent: *
Disallow: /
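To confirm the file is reachable on that host and that the rule is interpreted the way you expect, you could check it programmatically. Here is a minimal sketch using Python's standard-library urllib.robotparser, assuming the example host http://cobrand.example.com/ from above (substitute your actual cobrand subdomain):
from urllib.robotparser import RobotFileParser

# Example cobrand host; replace with your own subdomain.
ROBOTS_URL = "http://cobrand.example.com/robots.txt"

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt

# With "User-agent: *" / "Disallow: /" in place, this should print False
# for any well-behaved crawler, including Googlebot.
print(parser.can_fetch("Googlebot", "http://cobrand.example.com/any-page"))
If can_fetch returns False, the disallow rule is being served correctly for that host.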
Upvotes: 2