PeteT

Reputation: 19180

Google (Search Engine) Indexing advice for asp.net pages

I am working on a course leaflet system for the college I work at. Leaflets are stored in a database with the primary key course_code. Ideally, I would like the leaflets to be indexed by Google. How would I achieve this, assuming I develop the system in ASP.NET 2.0?

I understand that part of getting the pages indexed is to pass the variables in the link (in my case course_code), which also conveniently allows bookmarking of course leaflets. What are the specifics of getting Googlebot to crawl the system most effectively?

Upvotes: 4

Views: 1191

Answers (5)

Liam

Reputation: 20950

Use wget to crawl your site:

wget -r www.example.com

If wget does not reach some of your URLs, then Google is unlikely to reach them.

Upvotes: 0

The Alpha Nerd

Reputation: 105

Add a sitemap.xml file to the site root and submit it to Google; sitemaps help a lot.
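A minimal sitemap for the course leaflet system might look like this (one `<url>` entry per course; the domain and course code are illustrative, borrowed from the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One entry per leaflet; generate these from the courses table -->
  <url>
    <loc>http://www.yoursite.com/default.aspx?course_code=CIS612</loc>
  </url>
</urlset>
```

Since the leaflets live in a database, it is easy to generate this file (or an HTTP handler serving it) from the same table that holds the course codes.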

Upvotes: 0

Mitchel Sellers

Reputation: 63126

One big thing is to use a URL-rewriting scheme, if you can, to avoid URLs like

http://www.yoursite.com/default.aspx?course_code=CIS612

With rewriting you could turn that into something like

http://www.yoursite.com/courses/CIS612.aspx

Clean URLs like that really help, as query strings are not ideal for search engines.

UrlRewriting.net is a good place to start with rewriters.
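As a rough sketch of the idea (not UrlRewriting.net's actual API), an ASP.NET 2.0 HttpModule can map the clean URL back to the query-string form using HttpContext.RewritePath; the URL pattern and class name here are hypothetical:

```csharp
using System;
using System.Text.RegularExpressions;
using System.Web;

// Hypothetical module: maps /courses/CIS612.aspx
// to /default.aspx?course_code=CIS612
public class CourseUrlModule : IHttpModule
{
    private static readonly Regex CourseUrl =
        new Regex(@"^/courses/(?<code>[A-Za-z0-9]+)\.aspx$",
                  RegexOptions.IgnoreCase);

    public void Init(HttpApplication app)
    {
        app.BeginRequest += delegate(object sender, EventArgs e)
        {
            HttpContext ctx = ((HttpApplication)sender).Context;
            Match m = CourseUrl.Match(ctx.Request.Path);
            if (m.Success)
            {
                // Visitors and Googlebot see the clean URL; the page
                // still receives course_code as a query-string parameter.
                ctx.RewritePath("/default.aspx?course_code=" +
                                m.Groups["code"].Value);
            }
        };
    }

    public void Dispose() { }
}
```

The module would be registered under the `<httpModules>` section of web.config. A library like UrlRewriting.net does essentially this for you, driven by configuration rather than code.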

Upvotes: 0

Jon Skeet

Reputation: 1502086

If the Google bot is able to crawl your page and get everywhere on your site just by following links, without filling in any forms or running any JavaScript, you should be good to go.

(Disclaimer: Although I work for Google, I haven't looked at what the crawler does, and know little beyond public knowledge.)

Upvotes: 0

chills42

Reputation: 14513

Look into Google's Webmaster Tools.

Upvotes: 3
