John Smith

Reputation: 630

Duplicate content fixed by robots.txt?

I have the following URLs with the same content:

http://www.mysite.com/forum/viewthread.php?thread_id=39&pid=1349
http://www.mysite.com/forum/viewthread.php?forum_id=2&thread_id=39

At the moment this duplication is a problem SEO-wise.

Can I solve my SEO problem by simply adding this to my robots.txt:
Disallow: /forum/viewthread.php?forum_id=*&

Or is that not going to solve anything?

Upvotes: 1

Views: 81

Answers (1)

Robert

Reputation: 3074

To answer the question: yes, you could use robots.txt to block those URLs.

Reference: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449
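As a rough sketch (using the example domain and parameter names from your question), the rule could be as simple as a prefix match, since Googlebot treats Disallow values as prefixes and also supports the * wildcard (wildcard support varies by crawler):

User-agent: *
Disallow: /forum/viewthread.php?forum_id=

Keep in mind that blocking a URL in robots.txt only stops crawling; it does not consolidate any link signals the blocked URL has earned onto the remaining URL.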

BUT a better way to handle this, according to Matt Cutts, is:

If the content is truly duplicated and there are simply multiple ways to land on the same page, I would recommend using canonical URLs. They tell the spiders that all of these pages share one common, preferred URL.

Reference: http://www.mattcutts.com/blog/seo-advice-url-canonicalization/
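For example, a sketch of the canonical tag that could go in the <head> output of viewthread.php, assuming you pick the forum_id variant from your question as the preferred URL (which variant to canonicalize to is your choice):

<link rel="canonical" href="http://www.mysite.com/forum/viewthread.php?forum_id=2&thread_id=39" />

With that in place, every parameter variation of the thread page points search engines at one preferred URL, and ranking signals are consolidated there instead of being split across duplicates.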

Upvotes: 1
