weexpectedTHIS

Reputation: 3376

Google Webmaster Tools claims my robots.txt is blocking almost all of my site

I have submitted a sitemap with many thousands of URLs, but when I look in Webmaster Tools it claims that 9,800 of my URLs are blocked by my robots.txt file.

What am I supposed to do to convince it that nothing is being blocked?

[Screenshot: Blocking my URL]

[Screenshot: Blank robots.txt]
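For reference, my robots.txt is effectively blank; a blank file and one that explicitly allows everything behave the same. A sketch of the explicit allow-all form (not my exact file):

```
User-agent: *
Disallow:
```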

Upvotes: -1

Views: 118

Answers (1)

John Mueller

Reputation: 1481

Sometimes this just means that the robots.txt file couldn't be reached (it returned a 5xx server error, or Googlebot simply didn't get a response). In those cases, Google treats any URL it attempts to crawl as disallowed by robots.txt. You can see this in the Crawl Errors section of Webmaster Tools (under Site Errors at the top).
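One quick way to rule this out is to confirm what your server actually returns for robots.txt. Here is a minimal sketch using Python's standard library; it assumes your site is at example.com, so replace that with your own domain:

```python
# Check whether robots.txt is reachable and what status/content it returns.
import urllib.request
import urllib.error

url = "https://example.com/robots.txt"  # replace with your own domain

try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        print("Status:", resp.status)
        print("Content-Type:", resp.headers.get("Content-Type"))
        print(resp.read().decode("utf-8", errors="replace"))
except urllib.error.HTTPError as e:
    # A 5xx here is exactly the case where Google treats every URL as disallowed.
    print("HTTP error:", e.code)
except urllib.error.URLError as e:
    # No response at all has the same effect as a server error.
    print("No response:", e.reason)
```

If this prints a 200 status and your (blank or allow-all) robots.txt content, the blocking reports are likely from earlier fetch failures and should clear up after Google re-crawls the file.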

Upvotes: -1
