Reputation: 3489
I'm reading up on the Sitemaps.org protocol (http://www.sitemaps.org/protocol.html) so I can create my own sitemap. However, one question comes to mind: how do I deal with subpages, for example http://www.example.com/page/subpage?
Can I just list it as a 'top-level' <url> tag, do I nest a <url> inside another <url> tag, or do I use a separate <urlset> for each top-level page?
And with that, can I list a *.php file that outputs XML in my robots.txt for Google to find, or am I required to use a *.xml file?
Thanks in advance,
Upvotes: 1
Views: 1460
Reputation: 96737
No, don't nest url in url. Each URL gets its own url element, and they are all direct children of the urlset element.
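For example, a sitemap listing a page and its subpage as sibling entries might look like this (the example.com URLs stand in for your own):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each page is its own <url> element, directly under <urlset> -->
  <url>
    <loc>http://www.example.com/page</loc>
  </url>
  <!-- The subpage is a sibling entry, not nested inside the parent page -->
  <url>
    <loc>http://www.example.com/page/subpage</loc>
  </url>
</urlset>
```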
Bots don't necessarily assume (or understand) a "hierarchy" (/page/subpage/subsub/…) in URLs. They treat each URL as a unique string; it doesn't matter to them whether your page about soup recipes is at /recipes/soups, at /soups, or at /what-i-like (of course, there are other reasons/use cases why hierarchical URLs might be a good idea).
And with that, can I list a *.php file that outputs XML in my robots.txt for Google to find, or am I required to use a *.xml file?
It doesn't matter how you create the XML file; it only matters that it gets delivered as XML (e.g. with Content-Type: application/xml). The extension (.xml, .php, …, or none at all) shouldn't matter.
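As a sketch of what such a script does (shown here in Python rather than PHP; the page list and URLs are made-up placeholders), the key points are building one flat <url> entry per page and noting that the response must be sent with an XML Content-Type:

```python
# Minimal sketch: generate a sitemap dynamically, the way a *.php script
# would. The pages list and example.com domain are placeholder assumptions.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return sitemap XML with one <url> child of <urlset> per page."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for page_url in urls:
        url = ET.SubElement(urlset, "url")  # flat sibling, never nested
        ET.SubElement(url, "loc").text = page_url
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

pages = [
    "http://www.example.com/page",
    "http://www.example.com/page/subpage",  # sibling entry, not nested
]
xml = build_sitemap(pages)
print(xml)
# When serving this, send the header: Content-Type: application/xml
```

In PHP the equivalent would be calling header('Content-Type: application/xml') before echoing the generated markup; the crawler only sees the delivered bytes and headers, not how they were produced.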
Note that you can also use RSS 2.0, Atom, or even plain text to create your sitemap.
Upvotes: 1