Adam

Reputation: 6122

robots.txt: exclude URLs that contain a specific part

I have URLs containing '/share-weddingvenue/'. Since these URLs no longer exist, I want to use robots.txt to disallow Googlebot from accessing any URL that contains '/share-weddingvenue/'.

Will this work to achieve that?

User-agent: *
Disallow: */share-weddingvenue/*

Upvotes: 0

Views: 339

Answers (2)

MarcelL

Reputation: 53

Since robots.txt allows only very limited pattern matching (not full regular expressions), the rule would be:

Disallow: /weddingvenues/share-weddingvenue/

More on this topic can be found here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449
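A quick way to sanity-check a deployed rule like this is Python's stdlib urllib.robotparser, which implements the original prefix-matching standard. A minimal sketch (the domain and sample URL are placeholders):

    from urllib.robotparser import RobotFileParser

    # Placeholder domain; point this at the real site's robots.txt.
    rp = RobotFileParser("http://example.com/robots.txt")
    rp.read()  # fetch and parse the live robots.txt

    # False means the rule blocks this URL for Googlebot.
    print(rp.can_fetch(
        "Googlebot",
        "http://example.com/weddingvenues/share-weddingvenue/123",
    ))

Note this checks only the original standard's prefix matching, not Google's wildcard extensions, so it agrees with a plain prefix rule like the one above.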

Upvotes: 1

mata

Reputation: 69032

No, it probably won't work. The documentation of the original robots.txt standard gives a good overview of what you can do in a robots.txt; note in particular:

Note also that globbing and regular expression are not supported in either the User-agent or Disallow lines.
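The same stdlib urllib.robotparser module used above illustrates this: a '*' in a Disallow line is treated as a literal character, never as a wildcard (the sample URL is hypothetical):

    from urllib.robotparser import RobotFileParser

    def blocked(disallow_line, url):
        # True if a strict prefix-matching parser blocks the URL.
        rp = RobotFileParser()
        rp.parse(["User-agent: *", disallow_line])
        return not rp.can_fetch("Googlebot", url)

    url = "http://example.com/weddingvenues/share-weddingvenue/123"

    # The '*' never matches as a path prefix, so this rule blocks nothing:
    print(blocked("Disallow: */share-weddingvenue/*", url))              # False

    # A plain path prefix does match:
    print(blocked("Disallow: /weddingvenues/share-weddingvenue/", url))  # True

(Google's own crawler does support '*' as an extension, but parsers that follow the original standard ignore it, as shown here.)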

If the content is gone, it's better to make sure those URLs return a 410 (Gone) status instead.
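For illustration, a minimal sketch of serving 410 for the removed section, using Python's stdlib http.server (the path check and port are assumptions; on a real site this would live in the framework or web-server configuration):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            # The removed section: tell crawlers the content is gone
            # for good, a stronger signal than 404 (not found).
            if self.path.startswith("/weddingvenues/share-weddingvenue/"):
                self.send_error(410, "Gone")
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"ok")

    if __name__ == "__main__":
        HTTPServer(("", 8000), Handler).serve_forever()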

Upvotes: 1
