Zerium

Reputation: 17333

Can I use robots.txt to block certain URL parameters?

Before you tell me 'what have you tried' and 'test this yourself', I would like to note that robots.txt updates propagate awfully slowly to search engines for my site (or any site), so if you could share theoretical experience, that would be appreciated.

For example, is it possible to allow:

http://www.example.com

And block:

http://www.example.com/?foo=foo

I'm not very sure.

Help?

Upvotes: 6

Views: 5230

Answers (1)

Sean Dawson

Reputation: 5786

According to Wikipedia, "The robots.txt patterns are matched by simple substring comparisons", and since the query string is part of the URL, you should be able to just add:

Disallow: /?foo=foo

or something fancier like

Disallow: /*?*

to block all query strings. The asterisk is a wildcard that matches any sequence of characters (including none); note that wildcard support is an extension honored by major crawlers such as Googlebot and Bingbot rather than part of the original robots.txt standard.
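Putting it together, a minimal robots.txt sketch for the situation in the question might look like the following (www.example.com is the asker's placeholder, and the wildcard rule assumes a crawler that understands wildcards, such as Googlebot or Bingbot):

User-agent: *
# Block any URL containing a query string, e.g. /?foo=foo
Disallow: /*?*

Everything without a query string, including the homepage, remains crawlable, since anything not disallowed is allowed by default. To check the rules without waiting for a recrawl, Google Search Console's robots.txt tester will report whether a given URL is blocked.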

Example of a robots.txt with dynamic URLs.

Upvotes: 7
