victor.ja

Reputation: 888

Disallow only the homepage (/) and allow all other pages in robots.txt

I need to prevent Google's web crawler from crawling only my homepage, located at /.

But I need to allow all the other pages to be crawled. How can I achieve that?

I tried doing:

User-agent: *
Disallow: /

User-agent: *
Disallow:

But it's not working.

Upvotes: 0

Views: 201

Answers (1)

Tobias Schwarz

Reputation: 243

You need to use the following for this:

User-agent: *
Disallow: /$

A URL's path is compared against the Disallow directives as a prefix match. The $ designates the end of the match pattern, so this Disallow directive will match only https://example.com/ and not https://example.com/foo.
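To see how this matching behaves, here is a minimal Python sketch of Google-style path matching (a simplified model for illustration; the function names are mine, and this is not Google's actual implementation):

import re

# A minimal sketch of Google-style robots.txt path matching.
# Disallow rules are prefix matches; '*' matches any run of
# characters and a trailing '$' anchors the end of the path.
def robots_pattern_to_regex(pattern):
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    parts = [re.escape(part) for part in pattern.split("*")]
    return re.compile(".*".join(parts) + ("$" if anchored else ""))

def is_disallowed(path, disallow_patterns):
    # A rule matches if its pattern matches at the start of the path.
    return any(robots_pattern_to_regex(p).match(path)
               for p in disallow_patterns)

# 'Disallow: /$' blocks only the homepage:
print(is_disallowed("/", ["/$"]))     # True  (homepage blocked)
print(is_disallowed("/foo", ["/$"]))  # False (other pages crawlable)

# Plain 'Disallow: /' is a prefix match, so it blocks everything:
print(is_disallowed("/foo", ["/"]))   # True

Note that * and $ were not part of the original robots.txt specification, but they are supported by Googlebot and most major crawlers (and are now codified in RFC 9309).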

Upvotes: 2
