clarkf

Reputation: 711

How to disallow multiple folders in robots.txt

I want to disallow robots from crawling certain folders and all of their subfolders.

I want to disallow the following:

http://example.com/staging/
http://example.com/test/

This is the code inside my robots.txt:

User-agent: *
Disallow: /staging/
Disallow: /test/

Is this right, and will it work?

Upvotes: 5

Views: 8119

Answers (1)

Olavo Mello

Reputation: 583

Yes, it is right! You have to add a Disallow directive on its own line for each path you want to block.

Like this:

User-agent: *
Disallow: /cgi-bin/
Disallow: /img/
Disallow: /docs/

A good trick is to use a robots.txt generator. Another tip is to test your robots.txt with Google's robots.txt testing tool in Search Console.
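
You can also sanity-check the rules locally before deploying. Here is a minimal sketch using Python's standard-library urllib.robotparser; the user agent name "MyCrawler" is just a placeholder:

from urllib.robotparser import RobotFileParser

# The rules from the question above.
rules = """\
User-agent: *
Disallow: /staging/
Disallow: /test/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Disallowed paths should return False, everything else True.
print(parser.can_fetch("MyCrawler", "http://example.com/staging/page.html"))  # False
print(parser.can_fetch("MyCrawler", "http://example.com/test/"))              # False
print(parser.can_fetch("MyCrawler", "http://example.com/blog/"))              # True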

Upvotes: 17
