Reputation: 2413
I work on a site that has some JavaScript-heavy pages. I'm putting URL templates into the JavaScript on the page, for the page's JS to use when posting information back to the server. For example:
var someUrlTemplate = '/widget/-1/edit';
// and later
$.get(someUrlTemplate.replace(/-1/, widgetId), ...);
The Googlebot is trying to follow '/widget/-1/edit'. I don't want it to, because the link is obviously a dead end.
I know others must have faced a similar issue, and I'm wondering what kinds of solutions people have come up with. I've read about wrapping JavaScript blocks in HTML comments or in CDATA sections inside comments. I've read about breaking the URL string into concatenated chunks and other methods of obfuscation. But I've found nothing on the interweb that seems like a definitive best practice.
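For reference, the concatenation approach I've seen described looks roughly like this (a minimal sketch; the helper name `buildWidgetUrl` is my own, not from any standard): the literal '/widget/-1/edit' never appears as one contiguous string in the page source, so a crawler scanning the markup for URL-like strings is less likely to pick it up.

```javascript
// Sketch of the "concatenated chunks" obfuscation. The full template
// string is only assembled at runtime, never present verbatim in the page.
function buildWidgetUrl(widgetId) {
  var parts = ['/wid', 'get/', '-1', '/edit'];
  // Join the chunks, then substitute the real id for the -1 placeholder.
  return parts.join('').replace('-1', String(widgetId));
}

buildWidgetUrl(42); // '/widget/42/edit'
```

Whether this actually deters the Googlebot is exactly what I'm unsure about, hence the question.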
Upvotes: 0
Views: 439
Reputation: 2724
Actually, search engines can and will pull down external JavaScript files. If you don't want search engines to grab any of your JavaScript, you can place the files in a directory and disallow that whole directory with a robots.txt file.
Usually the best practice is to serve your JavaScript, CSS, and other static assets from a separate subdomain (a CNAME). Then you can just put a robots.txt at that subdomain's root that blocks the whole host.
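For example, assuming your scripts live under a `/js/` directory on the main domain, the robots.txt disallow rule would look like this (the directory name is just an illustration):

```
User-agent: *
Disallow: /js/
```

If you go the separate-subdomain route instead, the robots.txt at that subdomain's root can simply block everything:

```
User-agent: *
Disallow: /
```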
Upvotes: 1
Reputation: 324610
My opinion of a best practice would be to put that URL in an external JS file. To my knowledge no search bot navigates into JS files, so it won't find the URL there. In fact, as much of your JS as possible should be in external files.
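A minimal sketch of what that would look like, using the asker's variable and a hypothetical file name (`widget-urls.js` is my assumption, not anything standard):

```javascript
// widget-urls.js -- an external file referenced from the page with
// <script src="/js/widget-urls.js"></script>. The indexed HTML itself
// then contains only the script reference, not the URL template string.
var someUrlTemplate = '/widget/-1/edit';

function widgetEditUrl(widgetId) {
  // Substitute the real widget id for the -1 placeholder before use,
  // e.g. $.get(widgetEditUrl(widgetId), ...);
  return someUrlTemplate.replace('-1', String(widgetId));
}
```

Note the other answer's caveat, though: search engines do fetch external JS files, so pairing this with a robots.txt disallow on the scripts directory is the safer combination.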
Upvotes: 1