d3bug3r

Reputation: 2586

Ruby on Rails sitemap

I have a Ruby on Rails app that sells items such as chairs, tables, etc. I am using the sitemap_generator gem to generate the sitemap. Here is my sitemap code:

# Set the host name for URL creation
SitemapGenerator::Sitemap.default_host = "http://www.example.com"

SitemapGenerator::Sitemap.create do
  add '/products', :priority => 0.7, :changefreq => 'daily'
end

When I run the command rake sitemap:refresh, a sitemap.xml.gz file is created in the public folder. My robots.txt is as follows:

# See http://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file
#
# To ban all spiders from the entire site uncomment the next two lines:
# User-agent: *
# Disallow: /
Sitemap: http://www.example.com/sitemap.xml.gz

Does this mean that all my products at www.example.com/products will be available for Google to index?

Thanks!!

Upvotes: 0

Views: 1148

Answers (1)

Jon

Reputation: 10918

Firstly, you're better off using the route helpers rather than hard-coded path strings. That way, if the paths ever change due to modifications to your routes.rb file, you won't need to worry about your sitemap being wrong:

SitemapGenerator::Sitemap.create do
  # sitemap_generator prepends default_host to each link, so path helpers are what its docs use
  add products_path, priority: 0.7, changefreq: 'daily'
end

Next, the entry above only adds the /products index page to your sitemap. You will probably also want to add each individual product, with a changefreq that reflects how often they change:

SitemapGenerator::Sitemap.create do
  add products_path, priority: 0.7, changefreq: 'daily'

  # find_each loads products in batches rather than all at once,
  # which keeps memory usage flat as your catalogue grows
  Product.find_each do |product|
    add product_path(product), priority: 0.7, changefreq: 'daily'
  end
end

Obviously, you will need to trigger a sitemap refresh each time you add or remove a product, or the new URLs won't appear in the generated file.
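In practice, many apps simply regenerate the sitemap on a schedule rather than after every change. A minimal sketch, assuming you use the whenever gem to manage cron jobs (the gem and the 4:30 am time are assumptions, not part of your setup):

```ruby
# config/schedule.rb (whenever gem)
# Regenerate the sitemap once a day; adjust the time to taste.
every 1.day, at: '4:30 am' do
  rake 'sitemap:refresh'
end
```

Run `whenever --update-crontab` to install the job. `rake sitemap:refresh` also pings search engines by default; use `rake sitemap:refresh:no_ping` if you don't want that during testing.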

Upvotes: 1
