Reputation: 8594
I have the following robots.txt
User-Agent: *
Disallow:
User-Agent: Googlebot
Allow: /
to disallow all bots except Google's. I made this change last week, but when I search for my domain name in Google I still get "A description for this result is not available because of this site's robots.txt". Am I doing something wrong? How often does Google re-crawl a domain?
Upvotes: 0
Views: 135
Reputation: 96567
Your robots.txt is not doing what you want: the User-Agent: * group with an empty Disallow: allows every bot, so nothing is blocked at all (but that's not related to the problem you mention).
If you want to disallow crawling for every bot except "googlebot", use this robots.txt instead:
User-agent: googlebot
Disallow:
User-agent: *
Disallow: /
Disallow: / means: disallow every URL.
Disallow: (with an empty value) means: disallow nothing, i.e., allow everything.
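You can sanity-check these rules yourself. As a rough sketch, Python's standard-library urllib.robotparser applies the same group-matching logic (the example URL is just a placeholder):

```python
from urllib.robotparser import RobotFileParser

# The corrected robots.txt: googlebot may crawl everything,
# every other bot is disallowed from all URLs.
rules = """\
User-agent: googlebot
Disallow:

User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# googlebot matches its own group (empty Disallow = allow everything).
print(rp.can_fetch("googlebot", "https://example.com/page"))      # True
# Any other bot falls through to the * group (Disallow: / = block all).
print(rp.can_fetch("SomeOtherBot", "https://example.com/page"))   # False
```

Note that well-behaved crawlers pick the most specific matching group, so Googlebot ignores the * group entirely once its own group exists.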
To your problem: There is no definite answer to how often Google crawls a site. It depends on various factors that outsiders can't calculate. Having to wait a week or two is not unusual.
Upvotes: 1