Reputation: 3455
There's a way of excluding complete page(s) from Google's indexing. But is there a way to specifically exclude certain part(s) of a web page from Google's crawling? For example, excluding the side-bar, which usually contains unrelated content?
Upvotes: 10
Views: 5710
Reputation: 2584
Google used to have <!--googleoff: all--> / <!--googleon: all--> tags that you could wrap around content, but they're gone from the documentation. They may only ever have applied to sites using Google's Search Appliance (RIP).
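For reference, the comments were simply wrapped around the block you wanted the crawler to skip; a minimal sketch of the historical usage (the sidebar markup is just an illustration):
<p>Main content that should be indexed.</p>
<!--googleoff: all-->
<div class="sidebar">Sidebar content the crawler was told to skip.</div>
<!--googleon: all-->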
The Wikipedia noindex page lists a number of other comments, tags, and attributes that search engines once respected for keeping parts of a page out of the index. Search Engine Optimization is such a war now that search engines simply decide on their own what to index.
Upvotes: 0
Reputation: 682
You can move the part of the page that you want to hide from Googlebot into a separate file, include it with an IFRAME tag, and then block indexing of that included file via the robots.txt file.
Add the iframe that includes the side-bar in your page:
<iframe src="sidebar.asp" width="100%" height="300">
</iframe>
Here are the rules to add to the robots.txt file to block the spider:
User-agent: *
Disallow: /sidebar.asp
Upvotes: 5
Reputation: 52533
If you're doing this for AdSense, here's an article on how to exclude content from the scraper. If you don't want Google to follow links, you can give them a rel="nofollow" attribute. Otherwise, I'm afraid you may be out of luck here.
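A quick sketch of both ideas; the link URL is made up, and the AdSense section-targeting comments are an assumption about what the linked article covers, so verify the exact syntax there:
<!-- A link Google is asked not to follow -->
<a href="/some-unrelated-page" rel="nofollow">Unrelated link</a>
<!-- Assumed AdSense section-targeting comments: ask the ad scraper to ignore this block -->
<!-- google_ad_section_start(weight=ignore) -->
<div class="sidebar">Side-bar content unrelated to the article</div>
<!-- google_ad_section_end -->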
Something else you could do, though I wouldn't necessarily recommend it, is detecting the user agent before rendering your page and, if it's a spider or bot, not rendering the portions of the page you want to exclude.
Upvotes: 1