Reputation: 14803
Is there a way to prevent users from locking up a Linux machine with code along the lines of:
#include <unistd.h>

int main(int argc, char **argv)
{
    /* fork bomb: keep spawning children until the machine runs out of process slots */
    while (1)
        fork();
}
The computers in question are in a computer lab, so I can't exactly disallow compiling... but is there some way of ensuring that such processes only consume a certain portion of the system's resources? The issue is compounded by the fact that any user can ssh into any of the systems, so really the only reason this hasn't become a problem yet is that most users are more or less unfamiliar with C or other low-level languages.
Still, I'd like to nip this one in the bud...
Upvotes: 2
Views: 373
Reputation: 881303
You can limit the total number of concurrent processes that each user is allowed to create. I think it's in /etc/security/limits.conf, and the nproc field is what you need to set.
Update: Just looked it up here and it appears my memory isn't failing me after all :-)
The simplest way is to add a line like this to /etc/security/limits.conf:
* hard nproc 50
which will limit all users to 50 processes. You may want to have a little more fine-grained control than that.
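For example, assuming the lab accounts all belong to a group called students and there is a staff account called labstaff (both names are just placeholders), you could cap the lab group while giving staff more headroom:
@students    hard    nproc    50
labstaff     hard    nproc    200
These limits are applied at login by pam_limits, so fresh ssh sessions pick them up as long as sshd is configured to use PAM.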
Alternatively, you can use ulimit to enforce the limit if limits.conf is not available on your system. You will have to make sure that all started processes are restricted, for example by putting the command into /etc/profile and all other possible entry points:
ulimit -Hu 50
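If you want to see the cap in action, here's a minimal test program (just an illustrative sketch): once the per-user process count reaches the limit, fork() starts failing with EAGAIN instead of taking the machine down:
#include <errno.h>
#include <signal.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    int count = 0;

    for (;;) {
        pid_t pid = fork();
        if (pid == -1) {
            /* the per-user process cap (nproc) has been reached */
            fprintf(stderr, "fork #%d failed: %s\n", count + 1, strerror(errno));
            kill(0, SIGTERM);   /* clean up the whole process group, children included */
            return 0;
        }
        if (pid == 0) {
            pause();            /* children just sleep so they count against the limit */
            _exit(0);
        }
        count++;
    }
}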
Upvotes: 11