larryzhao

Reputation: 3213

How should I respond to a search engine bot that requests an endpoint in the wrong format?

For example, I have an action on my site that responds to a JSON AJAX request.

Recently I have found a lot of errors caused by search engine bots hitting this endpoint with HTML requests.

What's the best practice for dealing with this? Should I just respond with an error, or should I create an HTML page solely so the search engine bot has something to crawl? I am using Rails, so Rails-specific recommendations would be appreciated.
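The failure mode described above can be sketched in plain Ruby, without Rails: a JSON-only endpoint that answers 406 Not Acceptable when a client, such as a crawler, asks for HTML, rather than blowing up with a server error. The `respond_for` method name and the empty `results` payload are illustrative only, not from the original question.

```ruby
require "json"

# Hypothetical sketch of content negotiation for a JSON-only endpoint.
# Returns a [status, body] pair based on the request's Accept header.
def respond_for(accept_header)
  if accept_header.to_s.include?("application/json")
    # Normal AJAX caller: serve the JSON payload.
    [200, { results: [] }.to_json]
  else
    # A bot asking for HTML: answer 406 Not Acceptable instead of
    # letting a missing-template error surface as a 500.
    [406, ""]
  end
end
```

In a Rails controller the same idea is usually expressed with `respond_to`, giving `format.json` a real response and letting other formats fall through to an error status.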

Thanks a lot.

Upvotes: 1

Views: 80

Answers (1)

blelump

Reputation: 3243

The best practice here is to handle these requests at your proxy server (e.g. nginx). What you can do:

  • create a robots.txt file in your public directory with appropriate rules (see here). Note, however, that robots.txt is purely advisory: well-behaved crawlers follow it, but nothing forces a bot to obey it.
  • create an nginx rule that rejects requests whose $http_user_agent looks like a bot, e.g.: Blocking all bots except a few with Nginx
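The second bullet can be sketched as an nginx location block. This is a minimal illustration, not a complete config: the `/search/suggestions` path, the upstream name, and the bot patterns are all placeholders to adjust to the endpoint and user agents actually seen in your logs.

```nginx
server {
    listen 80;
    server_name example.com;

    location /search/suggestions {
        # $http_user_agent holds the client's User-Agent header.
        # Return 403 to matching crawlers before the request ever
        # reaches the Rails app. Patterns are examples only.
        if ($http_user_agent ~* (googlebot|bingbot|yandex|baiduspider)) {
            return 403;
        }
        proxy_pass http://app_upstream;
    }
}
```

Blocking at the proxy keeps bot traffic out of the Rails error logs entirely, whereas robots.txt only asks crawlers to stay away.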

Upvotes: 1
