Meisner

Reputation: 785

Is making an AJAX site crawlable AND degrading gracefully with JS turned off possible?

According to this spec, making an AJAX site crawlable by Googlebot means you have to use hashbang (#!) links, which in turn means the site won't degrade gracefully when JavaScript is turned off (or progressively enhance when it is turned on). That would make crawlability and graceful degradation/progressive enhancement mutually exclusive in this case. Is that in fact so? Is there anything that can be done about it?
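For context, the fragment after #! is never sent to the server, so under that scheme a client-side script roughly like the following has to fetch and render the content; with JavaScript off, the visitor only ever sees the bare shell page. This is only an illustrative sketch: the /fragments endpoint and the #content element are made up for the example, not part of the spec.

```javascript
// Hashbang routing sketch: read the route from the URL fragment and
// fetch the matching content. Nothing here runs if JavaScript is off.
function loadFromHash() {
  var route = location.hash.replace(/^#!/, '') || '/';
  // Hypothetical endpoint that returns an HTML fragment for the route.
  fetch('/fragments' + route)
    .then(function (response) { return response.text(); })
    .then(function (html) {
      document.querySelector('#content').innerHTML = html;
    });
}

window.addEventListener('hashchange', loadFromHash);
loadFromHash();
```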

Note: To be transparent, I'll mention that this question was also asked on the Pro Webmasters site, but I consider it interesting from a purely programming point of view as well.

Upvotes: 1

Views: 149

Answers (1)

icktoofay

Reputation: 129079

When possible, I like to use AJAX for loading new pages only when history.pushState is available; when it is not, I fall back to non-AJAX navigation. While this may be a sub-par experience for those without history.pushState, it ensures the URL always points to the right place and that the site is accessible both to Google and to users with JavaScript disabled.
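A minimal sketch of that approach, not the answerer's actual code: it assumes the server returns just the page fragment when it sees an X-Requested-With header and that the swappable content lives in a #content element (both are assumptions for the example).

```javascript
// Intercept internal link clicks and load pages via AJAX only when
// history.pushState is supported; otherwise let the browser navigate normally.
document.addEventListener('click', function (event) {
  var link = event.target.closest ? event.target.closest('a') : null;
  if (!link || link.origin !== location.origin) return;

  // Feature-detect pushState. Without it (or with JS off), the plain link
  // works as usual, so crawlers and no-JS users get real pages at real URLs.
  if (!(window.history && typeof history.pushState === 'function')) return;

  event.preventDefault();
  fetch(link.href, { headers: { 'X-Requested-With': 'XMLHttpRequest' } })
    .then(function (response) { return response.text(); })
    .then(function (html) {
      document.querySelector('#content').innerHTML = html;
      history.pushState({}, '', link.href); // keep the address bar in sync
    });
});

// Handle back/forward so AJAX-loaded pages stay navigable.
window.addEventListener('popstate', function () {
  fetch(location.href, { headers: { 'X-Requested-With': 'XMLHttpRequest' } })
    .then(function (response) { return response.text(); })
    .then(function (html) {
      document.querySelector('#content').innerHTML = html;
    });
});
```

With this setup, crawlers and no-JS visitors follow ordinary links and get full page loads, while capable browsers get the AJAX experience with real, shareable URLs instead of hashbangs.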

Upvotes: 4
