Steve

Reputation: 51

IP Banning - most efficient way?

I run a large forum and, like everyone else, have issues with spammers/bots. There are huge lists of known spam IPs that you can download and use in .htaccess form, but my only concern is the file size. So I suppose the question is how big is too big, given that it's going to be loaded for every user. Adding all the IPs gets it to about 100 KB.

Is there an alternative with less overhead? Possibly doing it with PHP, or will that also result in heavy load due to the file size, checking IPs, etc.?

Any advice would be greatly appreciated.

Thanks,

Steve

Upvotes: 5

Views: 3233

Answers (8)

Lex Podgorny

Reputation: 2950

In the .htaccess in your DocumentRoot, after:

    Order Deny,Allow

append one line per blocked address:

    Deny from <black ip>

Upvotes: 0

Peeter

Reputation: 9382

Why force the webserver to handle blocking users? I'd suggest using null routes instead (using iptables will slow your server down as the number of blocked IP entries grows).

Read up on http://www.cyberciti.biz/tips/how-do-i-drop-or-block-attackers-ip-with-null-routes.html

http://php.net/manual/en/function.shell-exec.php
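Combining the two links above, a minimal PHP sketch of the null-route approach might look like this. It assumes a Linux host with the `ip` tool available, and that the command is run with root privileges (e.g. via sudo); the function name and IP are illustrative:

```php
<?php
// Build the null-route command for a given IP address.
// Validate first so nothing unexpected reaches the shell.
function null_route_command(string $ip): ?string
{
    if (filter_var($ip, FILTER_VALIDATE_IP, FILTER_FLAG_IPV4) === false) {
        return null; // not a valid IPv4 address
    }
    return 'ip route add blackhole ' . escapeshellarg($ip);
}

$cmd = null_route_command('203.0.113.7');
// shell_exec($cmd);  // requires root; try it on a disposable box first
```

The validation step matters: never pass raw request data to `shell_exec` without checking and escaping it.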

Upvotes: 0

initall

Reputation: 2385

Don't use such IP lists. They're likely to become outdated, and you might block the wrong requests. Just invest in good (or better) captchas, and only block IPs from time to time when they're really doing some kind of denial-of-service attack.

Upvotes: 0

Kissaki

Reputation: 9237

There are often more efficient ways than IP bans. For example, hidden form fields that only bots will fill out, or requiring JavaScript or cookies for submitting forms.
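A minimal sketch of the hidden-field ("honeypot") idea — the field name `website` is arbitrary, just something a bot would find plausible and a human would never see:

```php
<?php
// In the form template: a field hidden via CSS that humans never fill in.
// (The name "website" is an arbitrary example.)
echo '<input type="text" name="website" style="display:none" autocomplete="off">';

// On submit: any value in the honeypot field suggests a bot.
function looks_like_bot(array $post): bool
{
    return isset($post['website']) && $post['website'] !== '';
}
```

If `looks_like_bot($_POST)` returns true, silently drop the submission instead of showing an error, so bot authors get no feedback.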

For IP banning, I wouldn't use .htaccess files. Depending on your webserver, it may re-read the .htaccess files on every request. I'd definitely add the IP bans to your webserver's vhost configuration instead. That way I'd be sure the webserver keeps them in RAM and doesn't read them again and again.

Doing it via PHP would also be an option. That way, you could easily limit the bans to specific forms, like registration in your forum.
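Limiting the check to sensitive entry points might look like this sketch (the filename `banned_ips.txt` is an example; a plain-text list, one address per line, is assumed):

```php
<?php
// Run this only at sensitive entry points (e.g. the registration form),
// so ordinary page views pay no cost at all.
function ip_is_banned(string $ip, string $listFile): bool
{
    if (!is_file($listFile)) {
        return false; // no list, nobody banned
    }
    $banned = file($listFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    return $banned !== false && in_array($ip, $banned, true);
}

if (ip_is_banned($_SERVER['REMOTE_ADDR'] ?? '', __DIR__ . '/banned_ips.txt')) {
    http_response_code(403);
    exit('Registration is not available from your address.');
}
```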

Upvotes: 3

yankee

Reputation: 40850

Unless you already have problems with the load on your server, you will probably not notice the difference from a 100 KB .htaccess file. There may be faster alternatives: iptables, sorted IP lists that can be searched faster for matches, or even a database (though the overhead of a single database query might cancel out the benefit of indexed tables). But it is probably not worth the effort unless you run a forum with high load.

You can alternatively try captchas or similar measures. Everything in this direction comes at an expense, and nothing is 100% reliable.

Upvotes: 0

Ryan Fernandes

Reputation: 8536

Maybe you want to stop spam the good old-fashioned way: a captcha?

I believe that a Mr. Albert Einstein once said: Problems cannot be solved at the same level of awareness that created them :)

Upvotes: 0

mario

Reputation: 145512

There are a few options:

  • You can store the block list in a database. Querying there is more efficient than a loop in PHP.
  • You could pre-process the list with array_map("ip2long", $ips) to save memory and possibly lookup time.
  • You could package the IP list into a regular expression, and maybe run it through an optimizer (Perl's Regexp::Optimizer). PCRE matching would again be faster than a foreach with strpos tests: $regex = implode("|", array_map("preg_quote", file("ip.txt", FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)));
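The `ip2long` option from the list above can be sketched like this: pack the list into sorted integers once (when the list changes), then do a binary search per request. Function names are illustrative:

```php
<?php
// Pre-process once: map addresses to integers and sort them.
function build_blocklist(array $ips): array
{
    $longs = array_map('ip2long', $ips);
    $longs = array_filter($longs, 'is_int'); // drop malformed entries (ip2long returns false)
    sort($longs);
    return $longs;
}

// O(log n) membership test against the sorted integer list.
function ip_blocked(string $ip, array $sorted): bool
{
    $needle = ip2long($ip);
    if ($needle === false) {
        return false;
    }
    $lo = 0;
    $hi = count($sorted) - 1;
    while ($lo <= $hi) {
        $mid = intdiv($lo + $hi, 2);
        if ($sorted[$mid] === $needle) {
            return true;
        }
        if ($sorted[$mid] < $needle) {
            $lo = $mid + 1;
        } else {
            $hi = $mid - 1;
        }
    }
    return false;
}
```

Four bytes per address instead of a dotted-quad string, and no linear scan: even a six-figure blocklist stays cheap per request.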

But then, IP block lists are often not very reliable. Maybe you should implement the other two workarounds instead: hidden form fields to detect dumb bots, or captchas to block non-humans (not very user-friendly, but it solves the problem).

Upvotes: 2

Piskvor left the building

Reputation: 92792

Well, you are building a database of addresses, right? Wouldn't it be useful to use a database product for it? If you don't have any yet, SQLite could be up to the task.
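A sketch with PHP's PDO SQLite driver — the table name is illustrative, and `:memory:` is used only to keep the example self-contained (in production you'd point at a file, e.g. `sqlite:/var/data/blocklist.sqlite`):

```php
<?php
// One-time setup: an indexed table makes per-request lookups cheap.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE IF NOT EXISTS banned_ips (ip TEXT PRIMARY KEY)');

// Per request: a single primary-key lookup, fast however long the list grows.
function ip_in_blocklist(PDO $db, string $ip): bool
{
    $stmt = $db->prepare('SELECT 1 FROM banned_ips WHERE ip = ?');
    $stmt->execute([$ip]);
    return $stmt->fetchColumn() !== false;
}
```

Bulk-importing the downloaded lists is then a one-off `INSERT` loop inside a transaction, and updates don't touch your webserver config at all.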

Upvotes: 0
