hjuster

Reputation: 4070

How to make jQuery-powered websites crawlable?

I am building a single-page, JavaScript-powered website. I have all the necessary data for all pages echoed by PHP as JSON objects on my home page. I then initialize each page with a custom plugin made for it, which dynamically builds the DOM from the relevant JSON data I pass in, so there are no AJAX requests. Links on my website are in the format #!about, #!home, etc., and the plugins' init methods are currently called on hashchange. What should I do to make these pages crawlable by Google's bots, and how can I serve different title and description meta tags for each of these pages?
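
For reference, the data embedding looks roughly like this (heavily simplified; the $pages structure is just a placeholder for my real data):

<?php
// Hypothetical page data; my real structure is more complex.
$pages = array(
    'home'  => array('title' => 'Home',  'content' => '...'),
    'about' => array('title' => 'About', 'content' => '...'),
);
?>
<script>
// All page data is embedded in the home page as JSON,
// so the plugins never need to make AJAX requests.
var pageData = <?php echo json_encode($pages); ?>;
</script>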

I've tried various things I found in Google's docs and on many other websites. I changed links from #mylink to #!mylink, so Google should interpret them as requests carrying the _escaped_fragment_ GET variable, and then I tried adding this chunk of PHP code:

if (isset($_GET['_escaped_fragment_'])) {
    $fragment = $_GET['_escaped_fragment_'];
    // 301-redirect crawler requests to the HTML snapshot
    header('Location: Project.php?id=' . urlencode($fragment), true, 301);
    exit;
}

where Project.php is an HTML snapshot with the relevant information that I want crawled, basically just the core content. But as far as I can see, nothing happens. :( After all this, is there a way to achieve it without AJAX requests?
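
Project.php itself is, roughly, something like this (again simplified; the $pages lookup stands in for my real data source):

<?php
// Project.php: a static HTML snapshot of one page, for crawlers.
$pages = array(
    'home'  => array('title' => 'Home',  'description' => 'Welcome...', 'body' => '<h1>Home</h1>...'),
    'about' => array('title' => 'About', 'description' => 'About us...', 'body' => '<h1>About</h1>...'),
);

$id = isset($_GET['id']) ? $_GET['id'] : 'home';
if (!isset($pages[$id])) {
    header('HTTP/1.0 404 Not Found');
    exit;
}
$page = $pages[$id];
?>
<!DOCTYPE html>
<html>
<head>
    <title><?php echo htmlspecialchars($page['title']); ?></title>
    <meta name="description" content="<?php echo htmlspecialchars($page['description']); ?>">
</head>
<body>
    <?php echo $page['body']; ?>
</body>
</html>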

Upvotes: 0

Views: 1996

Answers (2)

HenchHacker

Reputation: 1626

Google has actually published a guide on how to make AJAX crawlable - who better to tell you how?

https://developers.google.com/webmasters/ajax-crawling/


Alternative Guide

If you find that hard to follow, try this guide on SitePoint, which runs you through how it's done: http://www.sitepoint.com/google-crawl-index-ajax-applications/
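
In short: Googlebot rewrites your #! URLs into ?_escaped_fragment_= requests, and your server has to answer those with an HTML snapshot. A rough sketch in PHP, since that's what you're using (the snapshots/ directory is a placeholder; per the scheme you can also redirect to the snapshot, as you tried):

<?php
// index.php: Googlebot turns http://example.com/#!about into
// http://example.com/?_escaped_fragment_=about, and this answers
// that request with a pre-rendered HTML snapshot.
if (isset($_GET['_escaped_fragment_'])) {
    $page = basename($_GET['_escaped_fragment_']); // guard against path tricks
    $file = "snapshots/$page.html";                // hypothetical snapshot directory
    if (is_file($file)) {
        include $file;
    } else {
        header('HTTP/1.0 404 Not Found');
    }
    exit;
}
// ...otherwise serve the normal JavaScript-powered page...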

Upvotes: 1

Andrew

Reputation: 7768

Well, the only way is to build an XML sitemap containing a link to each page, then submit it via Google Webmaster Tools.
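
For example, a minimal generator could look like this (example.com and the page ids are placeholders; Google's AJAX-crawling docs suggest listing the pretty #! URLs in the sitemap):

<?php
// sitemap.php: emits a minimal XML sitemap.
// Domain and page ids are placeholders for your own.
header('Content-Type: application/xml; charset=utf-8');
$pages = array('home', 'about', 'contact'); // hypothetical page ids
echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach ($pages as $page) {
    echo "  <url><loc>http://example.com/#!$page</loc></url>\n";
}
echo "</urlset>\n";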

Upvotes: 0
