Alex

Reputation: 66012

ActionView::MissingTemplate Error, Only When Visited By A Bot?

I have an action that serves my homepage. It works fine when visited normally (i.e. by a user in a web browser), but when visited by certain web crawlers it throws the following error:

 A ActionView::MissingTemplate occurred in tags#promoted:

 Missing template tags/promoted with {:handlers=>[:erb, :rjs, :builder, :rhtml, :rxml], :formats=>["text/*"], :locale=>[:en, :en]} in view paths "/Apps/accounts/app/views", "/usr/local/rvm/gems/ruby-1.9.2-p180@accounts/gems/devise-1.3.0/app/views"
 actionpack (3.0.4) lib/action_view/paths.rb:15:in `find'

It appears the bots are requesting the text/* format, for which there is no template, which makes sense. So I tried the following in my action:

def promoted
  request.format = :html # force html to avoid missing template errors
  # more action stuff....
end

In essence, I am trying to force the request's format to html so it serves the html template.

Yet every time this set of bots requests the page, the missing template error occurs.

It's not that big of a deal, but ideally I'd like to resolve this error, if only to stop getting these error emails from my app.

Is the only fix to make a file called my_action.text.erb and put some gibberish in it? Or can I solve this more elegantly?

Upvotes: 15

Views: 1557

Answers (2)

Jeff Wigal

Reputation: 708

I've been seeing these as well. You could use some middleware to rewrite these requests:

# Rack middleware that rewrites a bare "text/*" Accept header
# (as sent by these bots) to "text/html" so Rails can find a template.
class Bot
  def initialize(app)
    @app = app
  end

  def call(env)
    env["HTTP_ACCEPT"] = "text/html" if env["HTTP_ACCEPT"] == "text/*"
    @app.call(env)
  end
end
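For the middleware to take effect it has to be registered in the Rails stack. A minimal sketch for a Rails 3 app, assuming the class is saved in lib/bot.rb and the application module is named Accounts (both names are my assumptions):

# config/application.rb -- file path and module name are assumptions
require File.expand_path('../../lib/bot', __FILE__)

module Accounts
  class Application < Rails::Application
    # Append Bot to the middleware stack so the Accept header
    # is rewritten before the request ever reaches the router.
    config.middleware.use Bot
  end
end

Appending with use is enough here, since the header only needs to be rewritten before the request hits the router and template lookup.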

I forked a gem that kills off some MS Office discovery requests, and it seemed to make sense to add this middleware to it:

https://github.com/jwigal/rack-options-request

Upvotes: 7

Alex

Reputation: 66012

It turns out this specific set of bots is as dumb as a rock and ignores any request-format forcing like the above. I ended up disallowing these bots' User-Agents in my robots.txt, and the errors stopped. However, if somebody has a more elegant solution, please post it and I'll mark it as the accepted answer; otherwise, I'll accept this one in a couple of days.
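For reference, the robots.txt entries look something like this; the bot name below is a placeholder for the actual User-Agent tokens from the error emails:

# robots.txt -- "SomeBadBot" is a placeholder User-Agent token
User-agent: SomeBadBot
Disallow: /

Note this only works because these particular bots actually honor robots.txt; it's advisory, not enforced.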

Upvotes: 0
