Sideshow Bob

Reputation: 4716

Limiting total web script accesses per second: good idea or not? How to implement?

I am wondering how one would limit accesses per second globally - for all IPs, not per IP - to prevent password cracking by a botnet or similar. This would effectively turn a password-cracking attack into a DDoS attack.

Note that expected traffic to this script is very low (1 access per day).

For example

//pseudocode: stall until a full second has passed since the last access
while (now() - last_access < 1 second)
    sleep(0.3 seconds)

last_access = now()

Two questions:

  1. Is this a good idea?
  2. How would I do it in PHP? (I don't have control over the Apache config.)

Edit: If n users are trying to access the site at once, I am happy for each one to wait n seconds. They only need to load one page each.

Upvotes: 1

Views: 788

Answers (4)

Igal Zeifman

Reputation: 1146

Think about it.

This is basically a "How-to-DDoS-Myself" explanation.

If you set a limit and prevent access from that point on, you are simply crashing your own site - which is the end goal of a DDoS attack anyway. Also, setting such an artificial limit will crash your site even faster than any DDoS would (as the script will kick in before the real "hard limit" is reached).

DDoS mitigation deals with one question: "How do you stay operational during a DDoS attack?" - and shutting yourself down is not an answer.

It's like curing a headache with cyanide - it will "technically" work, but I wouldn't call it a cure.

Upvotes: 1

germi

Reputation: 4658

If you're absolutely sure that this site is only visited once a day legitimately, you can do so, of course.

You could use PHP to write to a text file and store the last access time there - that is, if you don't have access to a database or some other means of storing that information.
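A minimal sketch of that file-based approach, combined with the wait loop from the question (the filename, the 1-second window and the 0.3-second sleep are assumptions for illustration, not something the answer specifies):

<?php
// Hypothetical sketch: persist the last-access timestamp in a text file and
// make each request wait until a full second has passed since the last one.
$fp = fopen(__DIR__ . '/last_access.txt', 'c+');  // assumed filename
flock($fp, LOCK_EX);                              // later requests queue up here
$last = (float) stream_get_contents($fp);

while (microtime(true) - $last < 1.0) {           // global limit, not per IP
    usleep(300000);                               // 0.3 seconds
}

ftruncate($fp, 0);
rewind($fp);
fwrite($fp, (string) microtime(true));            // record this access
flock($fp, LOCK_UN);
fclose($fp);
// ... continue with the normal login/page logic ...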

As you mentioned yourself, this could lead to your site being permanently down, as even a very small attack (3-4 hits per second can hardly be called a DoS attack) would suffice to lock you out.

Upvotes: 2

RandomDuck.NET

Reputation: 490

There are better ways to prevent bots. First, forcing users to not be able to use your site quickly isn't a very good idea. Instead, using the same sort of approach, you can check how long it has been since an IP last tried to log in. If it's, for example, under 1 second, don't even process the input and just output "Slow down!". A better way is to use a CAPTCHA-like method, such as a simple math problem.
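A rough sketch of that per-IP check, assuming the timestamps are kept in small files under a writable ratelimit/ directory (the directory, the filenames and the 1-second window are illustrative assumptions, not part of the answer):

<?php
// Hypothetical per-IP throttle: one tiny timestamp file per client IP.
$ip   = $_SERVER['REMOTE_ADDR'];
$path = __DIR__ . '/ratelimit/' . md5($ip);   // assumed storage location
$last = is_file($path) ? (float) file_get_contents($path) : 0.0;

if (microtime(true) - $last < 1.0) {          // under a second since the last try
    exit('Slow down!');                       // don't even process the input
}

file_put_contents($path, (string) microtime(true));
// ... validate the submitted credentials as usual ...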

Upvotes: 1

Madara's Ghost

Reputation: 174977

Well, first, you'll need to figure out your threshold. What would count as a normal high load, and what would count as a DDoS attack?

Second, you'll need to save the last_access variable somewhere more permanent, like a database.

But yes, what you're looking for is possible, and is recommended if your site is under frequent attacks.

Instead of sleep (which is useless in the case of an attack), simply show a generic

<h1>The server is currently experiencing extreme load. Please try again later</h1>

Your threshold should be something more like 5 users in the last couple of seconds (in which case you need to log all accesses over the last few seconds and do the math yourself).
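A rough illustration of that sliding-window count follows - the answer suggests a database, but a flat file keeps the sketch self-contained, and the 5-hits-in-2-seconds numbers are assumed for the example:

<?php
// Hypothetical threshold check: count the accesses logged within the window.
$log    = __DIR__ . '/access_log.txt';  // assumed log location
$window = 2.0;                          // seconds to look back
$limit  = 5;                            // allowed accesses within that window
$now    = microtime(true);

$recent = array();
if (is_file($log)) {
    foreach (file($log, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $ts) {
        if ($now - (float) $ts < $window) {
            $recent[] = (float) $ts;    // keep only accesses inside the window
        }
    }
}

if (count($recent) >= $limit) {
    exit('<h1>The server is currently experiencing extreme load. Please try again later</h1>');
}

$recent[] = $now;
file_put_contents($log, implode("\n", $recent) . "\n");  // rewrite the trimmed log
// ... serve the page as normal ...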


A better approach is to enforce user registration, and use Captchas when registering.

Upvotes: 2
