Reputation: 103
My Rails server has been getting hit by a large number of invalid URL requests for a while now. The attack doesn't seem to impact the server much, except for the huge amount of logs it produces.
ActionController::RoutingError (No route matches [GET] "/xmlrpc.php"):
app_serv_green_1 |
app_serv_green_1 | actionpack (5.1.4) lib/action_dispatch/middleware/debug_exceptions.rb:63:in `call'
ActionController::RoutingError (No route matches [GET] "/wp-login.php"):
app_serv_green_1 |
app_serv_green_1 | actionpack (5.1.4)
What is the best way to deal with this situation?
Upvotes: 2
Views: 519
Reputation: 26294
Yes, those are bots scanning all websites looking for vulnerabilities. Install fail2ban, which scans the log file and places the offending IP addresses on a blacklist in the Linux firewall (via iptables) for a given amount of time. You configure regular expressions for what to search for and what to ban.
Fail2ban scans log files (e.g. /var/log/apache/error_log) and bans IPs that show malicious signs: too many password failures, probing for exploits, and so on. Generally Fail2Ban is then used to update firewall rules to reject the IP addresses for a specified amount of time, although any other arbitrary action (e.g. sending an email) can also be configured. Out of the box Fail2Ban comes with filters for various services (Apache, Courier, SSH, etc.).
On CentOS, install it with:
yum install epel-release
yum install fail2ban
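As a rough sketch, you could then add a filter matching the request lines Rails writes for these probes and a jail that reads your application log. This assumes your production log contains the standard request lines of the form Started GET "/wp-login.php" for 203.0.113.5 at ...; the filter name rails-noroute and the log path are placeholders, not something fail2ban ships with, so adjust them to your setup:

# /etc/fail2ban/filter.d/rails-noroute.conf
[Definition]
# Ban clients probing for WordPress/XML-RPC endpoints in the Rails request log
failregex = ^.*Started (GET|POST) "/(xmlrpc\.php|wp-login\.php)\S*" for <HOST>
ignoreregex =

# /etc/fail2ban/jail.local
[rails-noroute]
enabled  = true
port     = http,https
filter   = rails-noroute
logpath  = /var/www/myapp/log/production.log
maxretry = 2
findtime = 600
bantime  = 3600

After editing, restart the service (systemctl restart fail2ban) and check fail2ban-client status rails-noroute to see which addresses have been banned.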
Upvotes: 1
Reputation: 3388
The example you posted looks like a bot looking for a way into WordPress sites. You could use a service that blocks repeated attempts from bots (IP blacklisting). That way you are not filtering anything out of your logs, which might prove useful in the future. I use a service at https://www.sqreen.io, but there are a number of options available in this field.
Upvotes: 1
Reputation: 160
You can write a rule at the end of routes.rb:
get '*unmatched', to: 'invalid#not_found'
and have that action return a 404 response with an empty body (the glob segment needs a name, and the action must be a valid method name, so 'invalid#404' will not work).
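A minimal sketch of the matching controller (the InvalidController name and not_found action are just the placeholders used in the route above):

# app/controllers/invalid_controller.rb
class InvalidController < ApplicationController
  def not_found
    # Reply with an empty body and a 404 status for any unmatched path
    head :not_found
  end
end

Keep the catch-all as the very last entry in routes.rb, otherwise it will shadow any route defined after it.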
Upvotes: 2