Lazer

Reputation: 94830

How to limit the amount of memory accessible to my C code?

Just to test, I ran this code

#include <unistd.h>
#include <stdio.h>
#include <stdlib.h>

int main() {

    /* 10,000,000 pointers, each to a 1000-int block */
    int **ar = malloc(10000000 * sizeof(int *));

    int i;
    for (i = 0; i < 10000000; i++) {
        ar[i] = malloc(1000 * sizeof(int));  /* deliberately unchecked */
        ar[i][123] = 456;                    /* touch the block */
    }

    sleep(18);  /* hold on to the memory so the usage graph is visible */

    return 0;
}

The memory usage graph went like this (the bottom, pink-colored graph tracks memory):

[memory usage graph]

Though this program did not run out of memory, any further allocation would make malloc return NULL, and the subsequent write in the ar[i][123] = 456; line would then cause a segmentation fault.

I want to put a limit on how much memory my program can allocate, but I don't want to hard-code that limit statically into the program.

For example, is there a way to tell my program to use at most half of the memory available on the system where the binary is run, but no more?

So, if my program is run on a machine with 8GB of memory, it can use a max of 4GB, but in case the program runs on a machine with 256MB, only 128MB should be available to the program.

Also, I want to do this from within my program, rather than controlling the amount of memory available using some external utility.

Upvotes: 2

Views: 427

Answers (3)

mpontillo

Reputation: 13947

I'm surprised no one mentioned ulimit, which is the way to go on a UNIX platform.

If you want a programming solution, wrap your malloc() and free() calls with a custom function and keep track of how much memory you have allocated. Have a command-line parameter (or environment variable, configuration file setting, etc) the user can set, to allow THEM to decide how much RAM your program is allowed to use.

Upvotes: 2

R.. GitHub STOP HELPING ICE

Reputation: 215259

It's generally considered abusive for a program to attempt to detect the amount of physical memory on a machine and use a certain portion of that. What if 10 programs each think they're entitled to make use of half the machine's memory?

A much better approach to dealing with huge volumes of data would be to work with it on disk, and rely on the operating system's filesystem cache to make optimal use of the machine's physical memory in a way that's fair to multiple processes.

Upvotes: 2

John Carter

Reputation: 55271

If you're on Linux, you can parse the file /proc/meminfo on startup to find out how much memory the system has:

$ cat /proc/meminfo
MemTotal:         505872 kB
MemFree:           70332 kB
...

Upvotes: 0

Related Questions