Nash

Reputation: 375

Prevent Googlebot from crawling WooCommerce filters

Googlebot is crawling product filter parameters like the following:

/shop/?filter_size=10

/shop/?filter_color=red

/shop/?filter_color=blue&filter_size=20

I tried adding the following rules to my robots.txt file, but I can still see Google crawling these kinds of filter URLs:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /?s=
Disallow: /search/
Disallow: /wp-json/
Disallow: /cart/
Disallow: /wishlist/
Disallow: /checkout/
Disallow: /my-account/
Disallow: *?filter_color=* 
Disallow: *?filter_size=* 
Disallow: *?min_price=* 
Disallow: *?max_price=*
Disallow: /*add-to-cart=*
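
Looking more closely at the rules, I suspect the wildcards only match literal text, so *?filter_size=* catches /shop/?filter_size=10 but not /shop/?filter_color=blue&filter_size=20, where the second parameter follows & instead of ?. A variant I was considering (same WooCommerce parameter names as above, added under the same User-agent: * group; the trailing * should not be needed, since a rule already matches any URL that starts with its pattern):

Disallow: /*?filter_color=
Disallow: /*&filter_color=
Disallow: /*?filter_size=
Disallow: /*&filter_size=
Disallow: /*?min_price=
Disallow: /*&min_price=
Disallow: /*?max_price=
Disallow: /*&max_price=

From what I have read, Google also caches robots.txt for up to about a day, so new rules do not take effect immediately.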

I am using WooCommerce and the Yoast SEO plugin.

In Yoast, I enabled indexing only for pages, products, and product categories (product_cat).

This is causing high CPU load on the server, and these filter URLs are irrelevant for crawling.

How can I prevent Google from crawling the shop filters?

Upvotes: 2

Views: 2090

Answers (1)

Nash

Reputation: 375

After looking around, the best way I found is to block Googlebot from crawling the shop page and any URL with query parameters:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

User-agent: Googlebot
Disallow: /shop
Disallow: /?s=
Disallow: /search
Disallow: /wp-json
Disallow: /cart
Disallow: /wishlist
Disallow: /checkout
Disallow: /my-account
Disallow: /*?*
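
One caveat: Disallow rules are prefix matches, so Disallow: /shop also blocks any path that merely begins with those characters (say, a hypothetical /shop-guide/ page). If that is a concern, the two rules below can replace Disallow: /shop in the Googlebot group above, using $ to anchor the bare shop page and a trailing slash for its sub-paths:

Disallow: /shop$
Disallow: /shop/

Also keep in mind that robots.txt only stops crawling. Filter URLs that are already indexed can stay in Google's index for a while, so it may take some time, or a removal request in Search Console, before they drop out.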

Upvotes: 2
