Reputation: 4724
I want to reduce the load caused by search engine crawlers by flagging, in the sitemaps, the pages that are already indexed and have not changed.
This would eliminate roughly 95% of the over 2 million pages on our site that are re-crawled every time.
I found no such option in the Google sitemap docs. I assume the search engine wants to re-crawl every page even if the site says it has not changed, because it does not blindly trust the site to provide accurate information.
Upvotes: 0
Views: 22
Reputation: 96577
The Sitemaps protocol defines three XML elements that search engines may (but are not required to) use when deciding which documents to crawl, and how often:
- <changefreq>: how frequently the page is likely to change.
- <lastmod>: the date of last modification of the file.
- <priority>: the priority of this URL relative to other URLs on your site.
(Webmasters SE might help you find out which search engines support these elements, and in what way.)
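For illustration, a sitemap entry carrying all three hints might look like the following sketch (the URL and values are placeholders, not a guarantee of how any crawler will behave):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- page URL; example.com is a placeholder -->
    <loc>https://example.com/some-page</loc>
    <!-- date the page last changed (W3C Datetime format) -->
    <lastmod>2023-04-01</lastmod>
    <!-- hint that this page rarely changes -->
    <changefreq>yearly</changefreq>
    <!-- relative priority, 0.0-1.0; default is 0.5 -->
    <priority>0.3</priority>
  </url>
</urlset>
```

Keeping <lastmod> accurate is the most likely of the three to help: some crawlers treat it as a signal to skip unchanged pages, but none of these elements obligates them to.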
Upvotes: 1