Reputation: 341
Recently my server was attacked by a malicious web crawler that requested ~70,000 folders on my website in about 2 minutes. This caused some server lag, and it could also become a problem if the crawler finds an exploit in phpMyAdmin or something else.
Now my question: is there a fail2ban config that can block such scans?
I already found an older one (link), but it doesn't match the lines in my log.
The lines in my log look like this:
domain.tld:80 <Attacker IP (v4 & v6)> - - [17/Apr/2017:20:45:28 +0200] "GET /ph4/ HTTP/1.1" 404 4123 "http://www.google.com" "Googlebot/2.1 (+http://www.google.com/bot.html)"
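For a log format like the one above (vhost:port prefix before the client IP), a custom filter along these lines might work. This is only a sketch, not a tested config: the filter name `apache-scan`, the thresholds, and the exact regex are my own assumptions, and the regex should be verified against your actual log with the `fail2ban-regex` tool before enabling the jail.

```ini
; /etc/fail2ban/filter.d/apache-scan.conf  (hypothetical filter name)
[Definition]
; Match 404 responses in an access log whose lines start with "vhost:port".
; <HOST> is fail2ban's placeholder for the client IP (IPv4 or IPv6).
failregex = ^[^ ]+:\d+ <HOST> .*"(?:GET|POST|HEAD) [^"]*" 404\b
ignoreregex =

; Jail section, e.g. in /etc/fail2ban/jail.local
; Values below are illustrative: ban a host that triggers 20 matches
; within 60 seconds for one hour.
[apache-scan]
enabled  = true
port     = http,https
filter   = apache-scan
logpath  = /var/log/apache2/access.log
maxretry = 20
findtime = 60
bantime  = 3600
```

To check whether the regex actually matches your log lines, run `fail2ban-regex /var/log/apache2/access.log /etc/fail2ban/filter.d/apache-scan.conf` and look at the match count it reports.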
Upvotes: 1
Views: 704