Šime Vidas

Reputation: 185933

JavaScript links and SEO?

First, take a look at this demo page: http://vidasp.net/tinydemos/seo-javascript-links.html

There is a menu on the page, and clicking on a menu item will display various links to other web-pages (that are part of the web-site). The link URLs are in this format:

www.foo.com/articles/XXX/descriptive-title-of-the-article

... where XXX is a three-digit ID of the given article.

This all seems OK, but there is one issue: all those links are created dynamically via JavaScript. Take a look at the source code - at the bottom of the page there is a JavaScript variable (the db variable) that holds all the data used to generate the links.
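
For reference, here is a minimal sketch of what this kind of client-side link generation might look like (the db shape, element ID, and slug logic here are illustrative assumptions, not the demo's actual code):

// Hypothetical data store, analogous to the demo's db variable
var db = [
    { id: "001", title: "Descriptive title of the article" },
    { id: "002", title: "Another article" }
];

// On page load, build an anchor for each record and inject it into the menu
window.onload = function () {
    var container = document.getElementById("menu");
    for (var i = 0; i < db.length; i++) {
        var a = document.createElement("a");
        // Slugify the title into the /articles/XXX/descriptive-title format
        a.href = "http://www.foo.com/articles/" + db[i].id + "/" +
                 db[i].title.toLowerCase().replace(/\s+/g, "-");
        a.appendChild(document.createTextNode(db[i].title));
        container.appendChild(a);
    }
};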

I am using JavaScript because I don't want to use the server-side. I assume that, in that case, I would have to store the data in a SQL database and then use C#/PHP/etc. to generate the links. However, this is not an option for me - I am oriented strictly towards the client-side.

BTW, if you want to see a more elaborate demonstration of JavaScript-generated links, go here - http://www.w3viewer.com - there are ~400 links on that page, all of which are generated dynamically via JavaScript.

The question:

Now, I like this approach - using JavaScript to generate links - however, a consequence of this approach is that search-engine crawlers won't register any of those links; they just "see" an empty page with no links (which is an SEO disaster, I assume).

So, I was wondering, how could I optimize this approach?

Update (follow-up question):

Couldn't I use a Google sitemap to tell the Google crawler which web-pages exist on the web-site? That way I could keep the front-page (the demo above) as it is (with no static links), and the crawler would use the sitemap to crawl all the web-pages of my web-site.

I don't know anything about Google sitemaps yet, but I am wondering why no one suggested them. Could they be a solution to my issue?

Upvotes: 6

Views: 3617

Answers (6)

aiternal

Reputation: 1100

A sitemap would help Google crawl your pages, but Google ranks you by page title and content. Also, you are already using permalinks, which is good; if the page title also exists as an h1 tag inside the body, that would be great.

You would do better to put some content into the body as plain HTML, and use JavaScript to enrich the page's functionality. Even though Google's own pages are full of JavaScript, Google doesn't like JavaScript content. It is the ruler, and until it can identify JavaScript content, we all have to play by its rules.
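
Schematically (an illustrative skeleton, not the asker's actual markup):

<head>
    <title>Descriptive Title of the Article</title>
</head>
<body>
    <!-- the same title repeated as the main heading -->
    <h1>Descriptive Title of the Article</h1>
    <p>Some crawlable HTML content here, enhanced by JavaScript later.</p>
</body>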

If you do add a sitemap, you can use the template below.

At the top:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9 http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd" xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">

Then repeated <url> entries, one per page. The date includes a time zone; the priority ranges from 0 to 1 and defaults to 0.5:

<url>
  <loc>page url</loc>
  <lastmod>2011-02-06T03:13:29+02:00</lastmod>
  <changefreq>monthly</changefreq>
  <priority>0.7</priority>
</url>

And at the end:

</urlset>

Upvotes: 0

TheAlbear

Reputation: 5585

Just a quick thing to note: if you include links in your sitemap that can't be reached by crawling your site, you will be marked down within the search engines.

Such pages are seen as doorway pages, which are against the terms and conditions of most major search engines. Also, with no referring URLs, they will get a very low score, and even if they do get indexed, they won't rank very well.

Upvotes: 1

Dave Ward

Reputation: 60580

If you're avoiding the server-side because you prefer JavaScript to those other languages, you could always use node.js on the server. There's already a jQuery Templates view engine for node.js that works with Express, so you can even use the same template on client or server.
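
For illustration, a minimal sketch of serving the same links from Node (plain Express routing with manual string building; the jQuery Templates view engine Dave mentions would replace the concatenation, and the db array is assumed from the question):

var express = require("express");
var app = express();

// The same data that currently lives in the client-side db variable
var db = [
    { id: "001", title: "Descriptive title of the article" },
    { id: "002", title: "Another article" }
];

app.get("/", function (req, res) {
    // Render real anchors into the initial HTML, so crawlers can see them
    var items = db.map(function (article) {
        var slug = article.title.toLowerCase().replace(/\s+/g, "-");
        return '<li><a href="/articles/' + article.id + "/" + slug + '">' +
               article.title + "</a></li>";
    });
    res.send("<ul>" + items.join("") + "</ul>");
});

app.listen(3000);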

Unrelated: You shouldn't use the "latest" reference to jQuery on the CDN (i.e. 1.4 vs 1.4.4). Those requests are served with a very short expires header, which is a big performance disadvantage. At that point, it's faster for return visitors if you just use a self-hosted copy.
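
That is, prefer the fully-specified version (these are the Google CDN paths for the jQuery 1.4 line):

<!-- "latest in the 1.4 line": served with a very short expires header -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js"></script>

<!-- fully-specified version: served with a far-future expires header -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>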

Upvotes: 1

Sinan Ünür

Reputation: 118128

It seems like what you really need to do is generate the HTML from templates before deployment, using something like Template::Toolkit's ttree. Then you can keep your database on your development machine. No need for JavaScript.

Here is a simplified example:

[%- 
db = {
    Foo => [
        { id => "001", title => "First article" },
        { id => "002", title => "Another article" },
        { id => "003", title => "Yet another article" },
    ], 
    Bar => [
        { id => "004", title => "First article in this category" },
        { id => "005", title => "Another article in bar" },
        { id => "006", title => "Third bar article" },
    ],
    Baz => [
        { id => "007", title => "Baz article No. 1" },
        { id => "008", title => "The second Baz article" },
        { id => "009", title => "The last article" },
    ],
}
-%]

[%- FOR category IN db.keys -%]

<h2>[%- category -%]</h2>

[%- articles = db.$category -%]

[%- FOR article IN articles -%]

<p>Article: <a href="http://www.example.com/articles/[%- article.id -%]/">
    [%- article.title -%]</a></p>

[%- END -%]
[%- END -%]
C:\Temp> tpage t.html
<h2>Bar</h2>

<p>Article: <a href="http://www.example.com/articles/004/">First article in this category</a></p>

<p>Article: <a href="http://www.example.com/articles/005/">Another article in bar</a></p>

<p>Article: <a href="http://www.example.com/articles/006/">Third bar article</a></p>

<h2>Baz</h2>

<p>Article: <a href="http://www.example.com/articles/007/">Baz article No. 1</a></p>
Upvotes: 3

cyber-guard

Reputation: 1846

You can use <noscript>All your anchor links here</noscript>, which means that crawlers and users with JavaScript turned off will see the links too. You should never forget about users without JavaScript, or base a page's functionality solely on JavaScript without providing a noscript alternative; catering to them will also benefit you in the SEO sense.
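
A minimal sketch of that idea (illustrative markup, not the asker's actual page):

<!-- filled in by JavaScript for regular visitors -->
<div id="menu"></div>

<!-- static fallback that crawlers and no-JS users see -->
<noscript>
    <a href="/articles/001/descriptive-title-of-the-article">Descriptive title of the article</a>
    <a href="/articles/002/another-article">Another article</a>
</noscript>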

Upvotes: 0

Luca Rocchi

Reputation: 6464

Use both JS and an href. The trick is simply to make the site work with plain hrefs, because that is what the Google bot will see; at the same time, a JS click handler is used if the browser supports it.

Of course, returning false from the handler also stops the href from being followed.
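
A sketch of the pattern (illustrative markup; showArticle is a hypothetical client-side handler, not a function from the demo):

<!-- The href is what the crawler follows; the onclick handler is what
     JS-capable browsers run. Returning false prevents the default navigation. -->
<a href="/articles/001/descriptive-title-of-the-article"
   onclick="showArticle('001'); return false;">
    Descriptive title of the article
</a>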

Upvotes: 0
