Reputation: 43884
I have a website with a search page that has 4 drop-downs. Each of these drop-downs redirects back to the page in question, but with a URL parameter telling it how to sort the results. The first drop-down has 13 options and the other three have 4 options each. Of course Google sees duplicate content and eats my SEO for it.
I have been able to reduce the duplicated content a little: the drop-down containing the 13 options is a category sorter, so I use it to change the title of each page, which helps with the duplication. The other three, however, are pure sorters and cannot really be used to change the title of the page (and shouldn't be).
So I have come up with a possible solution. The question is: should I add a nofollow to the sorter links to stop Google from following them, or can the sitemap just get that URL indexed without Google following the other links inside it?
Or is there a better way?
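For reference, by adding nofollow to the sorter links I mean something like this (the path and parameter names are just placeholders for my actual ones):
<!-- sorter link marked nofollow; URL and parameter names are illustrative -->
<a href="/search?category=books&amp;sort=price" rel="nofollow">Sort by price</a>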
Also, as a side note:
If I have a URL in sitemaps.xml like:
/user/view?id=1
But I have a robots.txt line like:
Disallow: /user/view
Can Google still index the sitemap URL, and is it good practice to block access to dynamic pages like that?
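To make the side note concrete, the sitemap entry and robots.txt rule I mean would look roughly like this (the domain is just a placeholder):
<!-- entry in sitemaps.xml for the dynamic page -->
<url>
  <loc>https://www.example.com/user/view?id=1</loc>
</url>
# robots.txt
User-agent: *
Disallow: /user/view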
Thanks,
Upvotes: 2
Views: 332
Reputation: 219804
A better way to handle this is to use canonical URLs. A canonical URL tells Google which page is the "main" page to include in its index, and that the other pages which duplicate it should be treated as the same page (and not included in the search results). This saves you from having to block pages or, worse, use nofollow on internal links. In fact, in the blog post I linked to, the examples given are almost identical to your use case.
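As a rough sketch (the URLs and parameter names below are placeholders for yours), each sorted variant of a page would declare the unsorted page as its canonical in the <head>:
<!-- on /search?category=books&sort=price and every other sort variant -->
<link rel="canonical" href="https://www.example.com/search?category=books" />
Google will then generally consolidate the sorted variants under that one canonical URL, so the sorter links can stay as normal, followable links.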
Upvotes: 2