IVIR3zaM

Reputation: 557

how to request google to recrawl a url (without Fetch As Google)

I'm working on a website that gets 15,000 visits per day. We host paid ads, some of them important (they are for banks), and the content of these ads never changes. At some point the banks announced that they needed more people, using the same repeated content (the jobs and requirements never change :| ). Now we need Googlebot to recrawl those URLs, so we rewrote the target pages with different wording, but Googlebot doesn't recrawl them as fast as we need. Using Fetch as Google is an option, but there are a lot of ads and we need an automatic way to do this. What is your advice?

Upvotes: 0

Views: 1276

Answers (3)

Unfortunately, what you want is not directly possible: Google decides how often to recrawl a page based on its popularity, the number of backlinks, and how valuable it thinks the page is to its users. You can't force Google to recrawl a page if you don't want to use Fetch as Google.

However, there is a workaround. Create a new page with the new content under a new URL (containing a recent date, for example, but otherwise the same path), and have the old page redirect to it with a 301, or set a canonical link from the old page to the new one. This should significantly accelerate indexing of the new page while dropping the old one fairly quickly, and it will pass most of the PageRank from the old page to the new one. Make sure the new page is added to your sitemap.xml too.
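The scheme above can be sketched roughly as follows; the path layout and helper names here are hypothetical, and the WSGI snippet is just one way of emitting the 301 (adapt it to whatever server or framework you use):

```python
from datetime import date

def dated_url(old_path: str, publish_date: date) -> str:
    """Build a new URL for the refreshed ad page by appending a date slug.

    The path scheme is made up for illustration; adapt it to your URL layout.
    """
    return f"{old_path.rstrip('/')}-{publish_date.isoformat()}"

# Old ad page and its refreshed replacement.
old = "/ads/bank-teller-jobs"
new = dated_url(old, date(2014, 6, 1))
# new == "/ads/bank-teller-jobs-2014-06-01"

# The old page should then answer with a permanent redirect, e.g. in a
# bare-bones WSGI handler:
def redirect_301(start_response, location):
    # 301 tells crawlers the move is permanent, so link equity follows it.
    start_response("301 Moved Permanently", [("Location", location)])
    return [b""]
```

The alternative is to leave the old page in place and add `<link rel="canonical" href="...">` in its `<head>` pointing at the new URL, but a 301 is the stronger signal.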

That will achieve what you want automatically (pretty much).

Upvotes: 1

masmrdrr

Reputation: 559

Try to avoid ad-hoc automation and one-off link submission. Build a clean XML sitemap instead. A sitemap tells search engines about your URLs, including pages that are generated dynamically or change frequently. If your sitemap uses well-formed XML, supplies clean, valid URLs, and meets the other requirements of the search engines, the URLs it contains will at least be noted for consideration in future crawling activity. Patience pays off....

A healthy XML sitemap helps sites that use dynamic URLs for their content pages, and sites with archived content that's not well linked from the currently active pages. XML sitemaps also help sites with hard-to-discover pages that use hard-to-crawl links (such as those in scripts) or that are heavy in non-text content such as Flash or Silverlight.

There are a lot of solutions online to help generate an XML sitemap. All the best!
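If you'd rather generate the sitemap yourself as ads are added, a minimal sketch with Python's standard library looks like this (the URLs and dates are made up for illustration):

```python
# Generate a sitemap.xml body from a list of ad pages.
import xml.etree.ElementTree as ET
from datetime import date

pages = [
    ("https://example.com/ads/bank-jobs-2014-06-01", date(2014, 6, 1)),
    ("https://example.com/ads/insurance-jobs", date(2014, 5, 20)),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    # lastmod tells crawlers when the page last changed.
    ET.SubElement(url, "lastmod").text = lastmod.isoformat()

sitemap_xml = ET.tostring(urlset, encoding="unicode")
```

Regenerate the file whenever an ad page is added or rewritten, and reference it from robots.txt or submit it in Webmaster Tools so new URLs are picked up without manual fetch requests.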

Upvotes: 1

Chris Pateman

Reputation: 559

When you submit your sitemap you can specify how often the content changes. This doesn't guarantee Google will crawl at that frequency, but it tells Google how often the pages are expected to change.
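In the sitemaps.org protocol that hint is the `<changefreq>` element; a single entry might look like this (the URL is illustrative, and `<priority>` is a further optional hint):

```xml
<url>
  <loc>https://example.com/ads/bank-jobs</loc>
  <changefreq>weekly</changefreq>
  <priority>0.8</priority>
</url>
```

Note that `<changefreq>` is a hint, not a command: search engines may crawl more or less often than stated.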

See Google's help documentation on creating a sitemap.

Upvotes: 0
