Reputation: 3213
For example, I have an action that responds to a JSON AJAX request on my site.
Recently I found a lot of errors caused by search engine bots requesting this endpoint with the HTML format.
What's the best practice for dealing with this? Should I just respond with an error in that format, or should I create an HTML page only because a search engine bot crawls it? I am using Rails, so specific recommendations would be even nicer.
Thanks a lot.
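For illustration, a minimal sketch of the kind of action I mean (controller and action names are made up); with no format.html block, an HTML request from a crawler typically surfaces as an ActionController::UnknownFormat error:

```ruby
class SuggestionsController < ApplicationController
  # Hypothetical JSON-only endpoint, called via AJAX from the site's own pages.
  def index
    respond_to do |format|
      format.json { render json: { suggestions: [] } }
      # There is intentionally no format.html block, so an HTML request
      # (e.g. from a search engine bot) raises ActionController::UnknownFormat.
    end
  end
end
```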
Upvotes: 1
Views: 80
Reputation: 3243
The best practice here is to deal with them at your proxy server (e.g. nginx). What you can do here:

- Add a robots.txt file within your public directory and create appropriate rules (see here). However, since these are just rules, bots do not have to obey them. A sketch follows below this list.
- Add an nginx rule that rejects requests which look like they come from bots, using $http_user_agent, e.g.: Blocking all bots except a few with Nginx. A sketch of such a rule also follows below.
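As a rough sketch of the first option, assuming the AJAX endpoint lives under a hypothetical /suggestions path (adjust to your real route):

```
# public/robots.txt -- the /suggestions path is an assumption, not your actual route
User-agent: *
Disallow: /suggestions
```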
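And a sketch of the second option, blocking a few well-known crawler user agents at nginx before the request ever reaches Rails (the upstream, domain, path, and bot list are all assumptions to adapt):

```nginx
upstream rails_app {
    server 127.0.0.1:3000;  # assumption: Rails app listening locally on port 3000
}

server {
    listen 80;
    server_name example.com;  # placeholder domain

    location /suggestions {
        # Reject requests whose User-Agent matches common crawlers (case-insensitive).
        if ($http_user_agent ~* "(googlebot|bingbot|yandexbot|baiduspider)") {
            return 403;
        }
        proxy_pass http://rails_app;
        proxy_set_header Host $host;
    }

    location / {
        proxy_pass http://rails_app;
        proxy_set_header Host $host;
    }
}
```

Returning 403 (or nginx's non-standard 444, which simply drops the connection) keeps these requests out of your Rails error reports entirely.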
Upvotes: 1