Reputation: 3020
my robots.txt looks like this:
User-agent: *
Disallow: /admin
Disallow: /test
User-Agent: Googlebot
Disallow: /maps
Now Google ignores the User-agent: * section and only obeys the specific Googlebot directives (/maps). Is this normal behaviour, and shouldn't it also obey the User-agent: * directives (/admin, /test)?
It seems strange to have to repeat every directive for every user-agent.
Upvotes: 5
Views: 1453
Reputation: 3020
Never mind, Google states this:
Each section in the robots.txt file is separate and does not build upon previous sections. For example:
User-agent: *
Disallow: /folder1/

User-Agent: Googlebot
Disallow: /folder2/
In this example only the URLs matching /folder2/ would be disallowed for Googlebot.
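So, assuming you want Googlebot to keep respecting the /admin and /test rules from the question, those directives would have to be repeated inside the Googlebot group, since groups don't inherit from each other. A sketch of what that might look like:

User-agent: *
Disallow: /admin
Disallow: /test

User-Agent: Googlebot
Disallow: /admin
Disallow: /test
Disallow: /maps

With this layout, Googlebot matches its own group and still gets all three rules, while every other crawler falls back to the * group.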
Upvotes: 4