Reputation: 3376
I have submitted a sitemap with many thousands of URLs, but when I look at Webmaster Tools it claims that 9800 of my URLs are blocked by my robots.txt file.
What am I supposed to do to convince it that nothing is being blocked?
Upvotes: -1
Views: 118
Reputation: 1481
Sometimes this just means that the robots.txt file couldn't be reached (it returned a 5xx server error, or Googlebot simply didn't get a response). In those cases, Google treats every URL it attempts to crawl as disallowed by robots.txt. You can see this in the Crawl Errors section in Webmaster Tools (under the site errors at the top).
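If you want to rule that out yourself, you can check what status code your robots.txt actually returns. Here is a minimal Python sketch (assuming your site is at the placeholder https://example.com; substitute your own domain):

```python
import urllib.request
import urllib.error

# Hypothetical placeholder - replace with your own domain.
ROBOTS_URL = "https://example.com/robots.txt"

request = urllib.request.Request(
    ROBOTS_URL,
    # Identify as Googlebot in case the server treats crawlers differently.
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                           "+http://www.google.com/bot.html)"},
)

try:
    with urllib.request.urlopen(request, timeout=10) as response:
        # 200 means the file is reachable; a 5xx response would make
        # Google treat every URL on the site as disallowed.
        print(f"robots.txt returned HTTP {response.status}")
        print(response.read().decode("utf-8", errors="replace")[:500])
except urllib.error.HTTPError as e:
    print(f"robots.txt returned HTTP {e.code} - a 5xx here would explain the blocked URLs")
except urllib.error.URLError as e:
    print(f"robots.txt could not be fetched at all: {e.reason}")
```

A one-off check like this won't catch intermittent server errors, though, so the Crawl Errors report is still the more reliable place to confirm what Googlebot actually saw.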
Upvotes: -1