serhio

Reputation: 28586

Add a robots.txt per site in Umbraco

I use a multi-site, multi-language Umbraco CMS. I have something like this:

first.com
second.com
third.com/en
third.com/fr

Is there a way to add/manage the robots.txt individually for each site...

first.com/robots.txt
second.com/robots.txt
third.com/en/robots.txt
third.com/fr/robots.txt

Secondly, the sitemap should behave the same way...

(I am using Umbraco v4.7.2.)

Packages like "Cultiv.DynamicRobots" allow only one robots.txt (even if dynamic) per Umbraco installation, and do not permit third.com/fr/robots.txt.

Upvotes: 4

Views: 1059

Answers (2)

serhio

Reputation: 28586

As Eyescream pointed out, one solution is to redirect to a page of the site... but with multiple sites and languages that means some work for each one. Another approach with the same idea, which I ended up using, is to modify the DynamicRobots component (here is the source code) so that it also reads a query string parameter of the request, say "ln".

Then I added a redirect from "mysite.com/fr/robots.txt" to "mysite.com/robots.txt?ln=fr", retrieved the value from Request.QueryString, and appended it to the site host: {HTTP_HOST} + QueryString["ln"].
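For illustration, here is a minimal sketch of that idea. It is not the actual Cultiv.DynamicRobots source; RobotsHandler, LoadRobotsFor and the App_Data file layout are hypothetical. The handler builds a key from {HTTP_HOST} plus the "ln" query string parameter and serves the matching robots body:

    // Sketch only: an ASP.NET handler that picks a robots.txt body per host,
    // optionally suffixed with the "ln" parameter added by the redirect.
    using System.Web;

    public class RobotsHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            // e.g. "third.com" + "fr" => key "third.com/fr"
            string host = context.Request.ServerVariables["HTTP_HOST"];
            string language = context.Request.QueryString["ln"];

            string key = string.IsNullOrEmpty(language)
                ? host
                : host + "/" + language;

            context.Response.ContentType = "text/plain";
            context.Response.Write(LoadRobotsFor(key));
        }

        // Hypothetical helper: map the key to a robots.txt body stored per
        // site, e.g. in files named after the key under App_Data.
        private static string LoadRobotsFor(string key)
        {
            string path = HttpContext.Current.Server.MapPath(
                "~/App_Data/robots/" + key.Replace("/", "_") + ".txt");
            return System.IO.File.Exists(path)
                ? System.IO.File.ReadAllText(path)
                : "User-agent: *\nDisallow:";
        }
    }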

Upvotes: 1

Eyescream

Reputation: 911

You can use a URL rewrite rule to catch the call to robots.txt and redirect the request to your own page with the proper logic inside.
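A minimal sketch of that suggestion, assuming the rewrite is done in Global.asax rather than in an IIS or UrlRewriting config file; "Robots.aspx" is a hypothetical page holding the actual robots logic:

    // Catch any path ending in robots.txt and rewrite it to a custom page,
    // passing the language segment (e.g. "fr" from "/fr/robots.txt") along.
    using System;
    using System.Web;

    public class Global : HttpApplication
    {
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            string path = Request.Url.AbsolutePath;

            if (path.EndsWith("/robots.txt", StringComparison.OrdinalIgnoreCase))
            {
                // "/fr/robots.txt" => "fr"; plain "/robots.txt" => ""
                string language = path
                    .Substring(0, path.Length - "/robots.txt".Length)
                    .Trim('/');

                Context.RewritePath("~/Robots.aspx?ln=" + language);
            }
        }
    }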

Upvotes: 5
