Reputation: 6531
I am developing an ASP.NET 4.0 Web Forms application and am using routing for all my URLs. I'm unsure whether I should be putting entries in my robots.txt like this:
Disallow: /forum/editpost.aspx
(Actual path/filename)
OR like this:
Disallow: /forum/edit-post
I'm assuming it's the latter as that's how all my pages are referenced on the web, but thought it safer to check.
Upvotes: 0
Views: 361
Reputation: 4933
Use whatever Googlebot et al. will see when they crawl the page. robots.txt is not executed, parsed, or processed on the server side.
Upvotes: 0
Reputation: 5261
Assuming you never reference pages by their physical path and don't intend to (since you are using routing), you do not need to put physical pages into the robots file: the crawler will never find them.
As such, you just need to disallow the routed paths. If you're worried that you've accidentally used a physical path in a link somewhere on your site, you can disallow the physical pages as well, just to be safe.
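As a quick sanity check, here's a minimal sketch using Python's `urllib.robotparser` to confirm which URLs a given robots.txt blocks. The domain and the `view-post` path are hypothetical; the disallow rules mirror the routed and physical paths from the question.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: disallow the routed URL and, to be safe,
# the physical .aspx path as well.
robots_txt = """\
User-agent: *
Disallow: /forum/edit-post
Disallow: /forum/editpost.aspx
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The routed URL that crawlers actually see is blocked...
print(parser.can_fetch("*", "http://example.com/forum/edit-post"))     # False
# ...and so is the physical path, in case it leaked into a link somewhere.
print(parser.can_fetch("*", "http://example.com/forum/editpost.aspx")) # False
# Other routed pages remain crawlable.
print(parser.can_fetch("*", "http://example.com/forum/view-post"))     # True
```

Because `Disallow` rules are prefix matches against the URL path a crawler requests, only the form of URL that actually appears in your links matters.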
Upvotes: 2