Spam Tester

Reputation: 1

URLs restricted by robots.txt errors in Google Search Console

I am creating a WordPress site on a subdomain, and Google Search Console reports this error: "Sitemap contains URLs which are blocked by robots.txt."

Upvotes: 0

Views: 169

Answers (1)

Levi

Reputation: 869

The robots.txt file tells web robots (typically search engine crawlers) how to crawl pages on a website. So if a URL is blocked by robots.txt, open that file and either remove its path from a Disallow rule or add it to an Allow rule.

For example:

User-agent: *

Disallow: /wp-admin/    # blocks crawlers from crawling /wp-admin/

Allow: /wp-content/     # allows crawlers to crawl /wp-content/

Here User-agent specifies which crawler the rules apply to; the wildcard * matches all crawlers.

See the robots.txt guide on Moz.com and try configuring your robots.txt accordingly.
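As a quick sanity check before resubmitting the sitemap, Python's standard-library urllib.robotparser can tell you whether a given URL would be blocked by a set of robots.txt rules. This is a minimal sketch; the subdomain https://sub.example.com and the rules shown are placeholders, not your actual site.

```python
# Minimal sketch: test whether specific URLs are blocked by robots.txt
# rules, using Python's standard-library urllib.robotparser.
from urllib.robotparser import RobotFileParser

# Example rules mirroring the answer above (placeholder content).
rules = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-content/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) returns False for disallowed paths.
print(parser.can_fetch("*", "https://sub.example.com/wp-admin/options.php"))      # False
print(parser.can_fetch("*", "https://sub.example.com/wp-content/uploads/a.png"))  # True
```

To check your live file instead, call parser.set_url("https://sub.example.com/robots.txt") followed by parser.read(), then test each URL from the sitemap the same way.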

Upvotes: 0
