grayspace

Reputation: 32

How do I customize DNN's robots.txt so that a module-specific sitemap can be crawled by search engines?

I am using the EasyDNN News module for the blog, news articles, etc. on our DNN website. The core DNN sitemap does not include the articles generated by this module, but the module creates its own sitemap.

For example: domain.com/blog/mid/1005/ctl/sitemap

When I try to submit this sitemap to Google, it reports that my robots.txt file is blocking it.

Looking at the robots.txt file that ships with DNN, I noticed the following lines under the Slurp and Googlebot user-agents:

Disallow: /*/ctl/       # Slurp permits *
Disallow: /*/ctl/       # Googlebot permits *

I'd like to submit the module's sitemap, but first I'd like to know why /ctl is disallowed for these user-agents, and what the impact would be if I simply removed these lines from the file, specifically as it pertains to Google crawling the site.

As an added reference, I have read the article below about avoiding a duplicate content penalty by disallowing specific URLs that contain /ctl, such as login, register, terms, etc. I'm wondering if this is why DNN disallowed any URL containing /ctl.

http://www.codeproject.com/Articles/18151/DotNetNuke-Search-Engine-Optimization-Part-Remov

Upvotes: 0

Views: 881

Answers (1)

Chris Hammond

Reputation: 8963

The proper way to do this would be to use the DNN Sitemap provider, something that is pretty darn easy to do as a module developer.

I don't have a blog post/tutorial on it, but I do have sample code, which can be found at:

http://dnnsimplearticle.codeplex.com/SourceControl/latest#cs/Providers/Sitemap/Sitemap.cs

This will allow custom modules to add their own information to the DNN Sitemap.
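
For reference, here is a minimal sketch of what such a provider can look like, modeled loosely on that sample. It assumes DNN's SitemapProvider base class and the SitemapUrl/SitemapChangeFrequency types from the DotNetNuke.Services.Sitemap namespace; the ArticleLink type and GetArticleLinks helper are hypothetical stand-ins for your module's own data access.

using System;
using System.Collections.Generic;
using DotNetNuke.Entities.Portals;
using DotNetNuke.Services.Sitemap;

namespace MyModule.Providers
{
    public class ArticleSitemapProvider : SitemapProvider
    {
        // Called by the DNN sitemap builder; returns one SitemapUrl per article.
        public override List<SitemapUrl> GetUrls(int portalId, PortalSettings ps, string version)
        {
            var urls = new List<SitemapUrl>();

            foreach (ArticleLink article in GetArticleLinks(portalId))
            {
                var pageUrl = new SitemapUrl();
                pageUrl.Url = article.Url;
                pageUrl.LastModified = article.LastModified;
                pageUrl.ChangeFrequency = SitemapChangeFrequency.Daily;
                pageUrl.Priority = 0.5F;
                urls.Add(pageUrl);
            }

            return urls;
        }

        // Hypothetical carrier type and helper: replace with the module's own data access.
        private class ArticleLink
        {
            public string Url;
            public DateTime LastModified;
        }

        private IEnumerable<ArticleLink> GetArticleLinks(int portalId)
        {
            yield break; // query the module's article table here
        }
    }
}

The provider also has to be registered with DNN's sitemap configuration before it gets picked up; registration details vary, so check the linked sample project for how it plugs the provider in.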

The reason /ctl is disallowed is that the normal way to load the Login/Registration/Profile controls is to request site?ctl=login, and those pages are typically not something people want to have indexed.

The other option is to simply edit the robots.txt file.
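
If you do edit robots.txt, one less drastic option than removing the Disallow lines entirely is to add an Allow rule for just the module's sitemap path. For Googlebot, the most specific (longest) matching rule wins, so the narrower Allow overrides the broader Disallow. The path below is the example path from the question and would need to match your own site:

User-agent: Googlebot
Allow: /blog/mid/1005/ctl/sitemap
Disallow: /*/ctl/

That keeps the login/registration/profile URLs blocked while still letting Google fetch the sitemap itself.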

Upvotes: 0
