Slava 111

Reputation: 47

Fetching the page failed because it's denied by robots.txt

I'm trying to validate a page with Twitter tags in the Twitter Card Validator, but every time I get the error: "ERROR: Fetching the page failed because it's denied by robots.txt."
If I look at my robots.txt at 'http://domain.subdomain.com/robots.txt', it contains:

User-Agent: Twitterbot
Disallow:

I have tried to change it to:

User-Agent: *
Disallow:

But it did not help. Where am I going wrong?
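For reference, one way to check locally whether a robots.txt actually permits Twitterbot is Python's standard urllib.robotparser. This is a minimal sketch using the placeholder domain from above; the page path /mypage is hypothetical:

    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.set_url("http://domain.subdomain.com/robots.txt")
    parser.read()  # fetch and parse the live robots.txt

    # An empty "Disallow:" allows everything, so this should print True
    print(parser.can_fetch("Twitterbot", "http://domain.subdomain.com/mypage"))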

Upvotes: 2

Views: 4413

Answers (1)

Liron

Reputation: 555

I had the same issue, and after digging into it I found that Twitterbot follows internal redirects, and those redirect targets must not be blocked by robots.txt either.
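To see this yourself, here is a rough sketch (assuming the third-party requests library; the domain and page path are placeholders) that follows redirects the way a crawler would and checks every hop against that host's robots.txt:

    from urllib.parse import urlsplit, urlunsplit
    from urllib.robotparser import RobotFileParser
    import requests

    def check_chain(url, agent="Twitterbot"):
        resp = requests.get(url, allow_redirects=True,
                            headers={"User-Agent": agent})
        # resp.history holds each intermediate redirect response
        for hop in resp.history + [resp]:
            parts = urlsplit(hop.url)
            robots = urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))
            parser = RobotFileParser()
            parser.set_url(robots)
            parser.read()
            ok = parser.can_fetch(agent, hop.url)
            print(hop.url, "->", "allowed" if ok else "BLOCKED")

    check_chain("http://domain.subdomain.com/mypage")

If any hop prints BLOCKED, that is the robots.txt you need to fix, not necessarily the one on the final page.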

I used the following site to see what was going on: robots.txt Validator and Testing Tool. If you mark the Check Resources checkbox, it will follow all redirects.

You can validate your Twitter card here: Card Validator

Upvotes: 4
