serge

Reputation: 15229

Use `robots.txt` in a multilingual site

I manage a multilingual site where users are redirected to a local version of the site, like

myBurger.com/en // for users from the US, UK, etc.
myBurger.com/fr // for users from France, Switzerland, etc.

How should the robots.txt file be organized, together with the sitemap(s)?

myBurger.com/robots.txt // with - Sitemap: http://myBurger.com/??/sitemap
OR
myBurger.com/en/robots.txt  // with - Sitemap: http://myBurger.com/en/sitemap
myBurger.com/fr/robots.txt  // with - Sitemap: http://myBurger.com/fr/sitemap

knowing that the en and fr sites are in fact independent entities that do not share common content, even if they look similar.

Upvotes: 1

Views: 3705

Answers (2)

Put the robots.txt at the root, myBurger.com/robots.txt, and register your sitemaps in that robots.txt file using the Sitemap: directive (see an example I maintain if necessary).
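For the single-root option, the file could look like this sketch (the sitemap filenames are assumptions; adjust them to wherever each language's sitemap actually lives):

```
# myBurger.com/robots.txt
User-agent: *
Allow: /

# One Sitemap line per language version
Sitemap: http://myBurger.com/en/sitemap.xml
Sitemap: http://myBurger.com/fr/sitemap.xml
```

Crawlers that honor the Sitemap: directive will discover both language sitemaps from this single file, so no per-directory robots.txt is needed.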

Upvotes: 0

user29671

Reputation: 782

You need to put one robots.txt at the top level.

The robots.txt file must be in the top-level directory of the host, accessible through the appropriate protocol and port number.

https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt

Upvotes: 4

Related Questions