Kevin Tad

Reputation: 79

Not understanding this robots.txt

Another company has set up the robots.txt for a site I manage. This is the code they used:

User-agent: googlebot
User-agent: google
User-agent: bingbot
User-agent: bing
Allow: /products/

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Disallow: /sales/
Disallow: /products/
Allow: /wp-content/uploads/
Allow: /wp-content/themes/
Allow: /wp-admin/admin-ajax.php*

I see Disallow: /products/ and Allow: /products/ for the same path. I don't understand why they wrote it this way. Should I change anything?

Upvotes: 0

Views: 51

Answers (1)

Evgeniy

Reputation: 2605

  1. The first directive, Allow: /products/, applies only to the Google and Bing bots named in the first group.
  2. The second directive, Disallow: /products/, applies to all other bots, i.e. those matched by `User-agent: *`.

Since a crawler obeys only the most specific group that matches its user agent, Googlebot and Bingbot follow the first group alone and ignore every Disallow in the second. I don't see much sense in these rules, but they don't violate the standard.
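
If you want to check this yourself, here is a minimal sketch using Python 3's standard urllib.robotparser (a simplified matcher that compares path prefixes and does not implement wildcard patterns, which is fine for these paths). The SomeOtherBot name is just a placeholder for any crawler not listed in the first group:

import urllib.robotparser

# The robots.txt from the question, verbatim.
ROBOTS_TXT = """\
User-agent: googlebot
User-agent: google
User-agent: bingbot
User-agent: bing
Allow: /products/

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Disallow: /sales/
Disallow: /products/
Allow: /wp-content/uploads/
Allow: /wp-content/themes/
Allow: /wp-admin/admin-ajax.php*
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot matches the first group, whose only rule is Allow: /products/.
print(parser.can_fetch("googlebot", "/products/"))      # True

# The first group has no Disallow rules at all, so Googlebot is not
# blocked from /wp-admin/ either -- it never reads the "*" group.
print(parser.can_fetch("googlebot", "/wp-admin/"))      # True

# Any other bot falls through to the "User-agent: *" group.
print(parser.can_fetch("SomeOtherBot", "/products/"))   # False
print(parser.can_fetch("SomeOtherBot", "/wp-admin/"))   # False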

Upvotes: 1
