Dmytro Zarezenko

Reputation: 10686

Common rule in robots.txt

How can I disallow URLs like 1.html, 2.html, ... ([0-9]+\.html in regexp terms) with robots.txt?

Upvotes: 1

Views: 177

Answers (1)

unor

Reputation: 96687

The original robots.txt specification supports neither regular expressions nor wildcards; a Disallow value is matched as a plain prefix of the URL path. That prefix matching is still enough to block URLs like these:

  • example.com/1.html
  • example.com/2367123.html
  • example.com/3
  • example.com/4/foo
  • example.com/5/1
  • example.com/6/
  • example.com/7.txt
  • example.com/883
  • example.com/9to5

with:

User-agent: *
Disallow: /0
Disallow: /1
Disallow: /2
Disallow: /3
Disallow: /4
Disallow: /5
Disallow: /6
Disallow: /7
Disallow: /8
Disallow: /9
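
To see the prefix matching in action, here is a minimal sketch using Python's standard-library urllib.robotparser. The blocked paths are samples from the list above; the hypothetical /about.html is included only to show a path that stays allowed.

import urllib.robotparser

# The exact rule set from above, fed straight to the parser.
rules = """
User-agent: *
Disallow: /0
Disallow: /1
Disallow: /2
Disallow: /3
Disallow: /4
Disallow: /5
Disallow: /6
Disallow: /7
Disallow: /8
Disallow: /9
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Disallow values are plain path prefixes, so every path that starts
# with a digit is blocked; /about.html stays allowed.
for path in ["/1.html", "/2367123.html", "/4/foo", "/9to5", "/about.html"]:
    print(path, rp.can_fetch("*", "https://example.com" + path))
# Prints False for the first four paths and True for /about.html.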

If you want to block only URLs where a single numeral is directly followed by .html, append .html to each rule:

User-agent: *
Disallow: /0.html
Disallow: /1.html
…

However, this wouldn't block, for example, example.com/12.html, because /1.html is not a prefix of /12.html.
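
The gap is easy to demonstrate with the same urllib.robotparser sketch as above (the rule list is shortened here, but is assumed to continue through /9.html):

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.parse("""
User-agent: *
Disallow: /0.html
Disallow: /1.html
Disallow: /2.html
""".splitlines())  # continued for /3.html through /9.html in the same way

print(rp.can_fetch("*", "https://example.com/1.html"))   # False: blocked
print(rp.can_fetch("*", "https://example.com/12.html"))  # True: "/1.html" is not a prefix of "/12.html"

As an aside, some major crawlers, for example Googlebot, additionally honour * and $ in Disallow values as an extension, but even that extension has no character classes, so there is no exact robots.txt equivalent of [0-9]+\.html.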

Upvotes: 1
