Reputation: 20115
I have an ajax-heavy website. I update the hash value in the address bar so that browsing history is preserved and the forward and back buttons still work. For example, a typical use case would be a user moving from site.com/directory#sports/1 to site.com/directory#sports/2 as they page through a section.
I believe these hash values are ignored by search engine crawlers: all links that share the same path before the hash are treated as a single URL. That would be bad for SEO, because a specific page cannot be indexed. For example, I wouldn't be able to search for "site.com sports" on Google and expect to find a link to site.com/directory#sports/1. So how do I retain ajax history and still have good SEO? As far as I know, hashes must be used so the page doesn't reload during ajax; you cannot rewrite the path portion of the URL while doing ajax without forcing a full page load.
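For illustration, here is roughly what I mean (the values are made up):

    // Changing only the hash keeps the current page alive, so the ajax
    // state survives and the back/forward buttons keep working:
    window.location.hash = 'sports/1';  // -> site.com/directory#sports/1

    // Changing the path instead triggers a full page load, which throws
    // away the ajax state:
    window.location.pathname = '/directory/sports/1';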
Upvotes: 0
Views: 623
Reputation: 50109
If you care about SEO and accessibility, you should use real URLs as links and add the AJAX behaviour by registering event listeners (such as onclick).
That way Google will reach your content through the links, and so will visitors with JavaScript disabled. The rest of your users still get the full dynamic AJAX experience.
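A minimal sketch of that pattern (the ajax-link class and the loadSection function are illustrative, not from the question):

    // Real, crawlable links in the markup, e.g.:
    // <a href="/directory/sports/1" class="ajax-link">Sports</a>

    var links = document.querySelectorAll('a.ajax-link');
    for (var i = 0; i < links.length; i++) {
      links[i].addEventListener('click', function (event) {
        event.preventDefault();                 // skip the full page load
        var path = this.getAttribute('href');
        loadSection(path);                      // hypothetical ajax loader
        // Mirror the state in the hash so back/forward still work:
        window.location.hash = path.replace('/directory/', '');
      });
    }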
Also read Google's AJAX recommendations.
Upvotes: 0
Reputation: 12269
Search engines normally need a way to find those links without using ajax. If you provide crawlable links that replicate your ajax pages (e.g. site.com/directory#movies/2), search engines can pick up your data.
You can do this by creating a navigation page with links to those pages, or by creating a sitemap.xml for your site that explains how to reach those pages.
Just make sure that when site.com/directory#movies/2 is hit directly, your site renders the movies/2 state and not just plain site.com/directory.
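Client-side, that could look roughly like this (renderSection is a hypothetical function that fetches and displays the requested section):

    // On a direct hit such as site.com/directory#movies/2, read the hash
    // and render the same content that ajax navigation would have shown.
    window.addEventListener('load', function () {
      var state = window.location.hash.replace(/^#/, ''); // e.g. "movies/2"
      if (state) {
        renderSection(state); // hypothetical: load and display that section
      }
    });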
Upvotes: 1
Reputation: 10570
You need a hash bang: #!. Read Google's Making AJAX Applications Crawlable.
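Under that scheme, the crawler rewrites site.com/directory#!movies/2 into site.com/directory?_escaped_fragment_=movies/2, and your server has to answer that request with an HTML snapshot of the page. A rough sketch of the server side (assuming Node.js; renderSnapshot and renderAppShell are hypothetical helpers):

    var http = require('http');
    var url = require('url');

    http.createServer(function (req, res) {
      var query = url.parse(req.url, true).query;
      var fragment = query._escaped_fragment_;
      res.writeHead(200, { 'Content-Type': 'text/html' });
      if (fragment !== undefined) {
        // The crawler is asking for the state behind #!<fragment>,
        // e.g. "movies/2": return a fully rendered HTML snapshot.
        res.end(renderSnapshot(fragment));
      } else {
        // A normal browser: return the regular AJAX application shell.
        res.end(renderAppShell());
      }
    }).listen(8080);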
Upvotes: 1