Reputation: 49
In C, I'm trying to create an integer array with 1,000,000,000 (1 billion) elements. This is what I tried:
#include <stdio.h>
#include <stdlib.h>
#include <ctype.h>
#include <string.h>
#include <math.h>
#include <time.h>
#define population 1000000000
int numberOfHandshaking;
int main(int argc, char **argv) {
    printf("******************************\n");
    printf("    FRIEND CIRCLE QUERIES\n");
    printf("******************************\n\n");

    FILE *fPointer;
    fPointer = fopen("C:\\Users\\ASUS\\Desktop\\AdvProTech2\\uebung_6\\Ass13in.txt", "r");
    fscanf(fPointer, "%d", &numberOfHandshaking); // Reads the first line
    printf("numberOfHandshakings: %d\n", numberOfHandshaking);

    int *persons = malloc((population + 1) * sizeof(int));
    for (int i = 0; i < 127; i++) {
        printf("persons[%d]= %d \n", i, persons[i]);
    }
    printf("\n\n\n");
    return 0;
}
It compiles without errors, but crashes when I try to access the elements. Can anyone help me?
Upvotes: 1
Views: 520
Reputation: 12668
First of all, you don't say which architecture you are trying this on. Second, you don't say what operating system you are using. You only say you are trying to malloc(3) an array of one billion ints.
Think twice: a single array of 1 000 000 001 int elements is 4 000 000 004 bytes in size (with 4-byte ints). If you are trying to handle this on a 32-bit architecture, you are asking the system to allocate essentially the full virtual memory space (which is 4 GB) just for your array. Most probably you have a hard limit that prevents you from allocating that much memory.
On a 64-bit architecture you have enough addresses to handle that, but again, your operating system will probably refuse to create a single contiguous mapping of that size (or it will exceed the amount of virtual memory your system can handle). Most likely you will not be able to get that memory. I see you are running this on Windows (judging by the path you use for the file name), so most probably you cannot even check how much memory you can use per process.
I have tested your program on a FreeBSD 64-bit system, with the following limits (I have to admit I tweaked them a little for my user account to handle your problem; a normal user in a multiuser system is never granted such limits by default):
$ ulimit -a
number of pseudoterminals (-P) unlimited
socket buffer size (bytes, -b) unlimited
core file size (blocks, -c) unlimited
data seg size (kbytes, -d) 33554432
file size (blocks, -f) unlimited
max kqueues (-k) unlimited
max locked memory (kbytes, -l) unlimited
max memory size (kbytes, -m) unlimited
open files (-n) 230121
pipe size (512 bytes, -p) 1
stack size (kbytes, -s) 524288
cpu time (seconds, -t) unlimited
max user processes (-u) 12042
virtual memory (kbytes, -v) unlimited
swap size (kbytes, -w) unlimited
(As you can see, I have a maximum data segment size limit of 33 GB, so if I had asked for a 10-billion-element array, I would have failed. Note that this limit is imposed by the kernel; I cannot raise it, even as root.)
The first time I ran it, the program died with SIGSEGV because the file pointer returned by fopen(3) was NULL (the file didn't exist, another thing you don't check in your program), so the program aborted on the fscanf(3) call. Note: you need to check the return values of your calls for errors. Once the input file was in place, the run was:
$ pru
*****************************************
THIS WAS FRIEND CIRCLE QUERIES
*****************************************
numberOfHandshakings: 99885
persons[0]= 0
persons[1]= 0
persons[2]= 0
[...]
persons[123]= 0
persons[124]= 0
persons[125]= 0
persons[126]= 0
Then I fixed a maximum virtual memory space of 4Gb, and repeated the test:
$ pru
*****************************************
THIS WAS FRIEND CIRCLE QUERIES
*****************************************
numberOfHandshakings: 99885
malloc failed: Cannot allocate memory
This time, the output of ulimit -a was:
$ ulimit -a
number of pseudoterminals (-P) unlimited
[... same as before ]
max user processes (-u) 12042
virtual memory (kbytes, -v) 4194304 <<<<<< 4Gb virtual memory.
swap size (kbytes, -w) unlimited
Your program, after the modifications I made so it checks the errors returned by fopen() and malloc(), was:
#include <ctype.h>
#include <errno.h>
#include <math.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>
#define population 1000000000
int numberOfHandshaking;
int main(int argc, char **argv)
{
    printf("*****************************************\n");
    printf("     THIS WAS FRIEND CIRCLE QUERIES\n");
    printf("*****************************************\n\n");

    FILE *fPointer;
    fPointer = fopen("Ass13in.txt", /* I changed this, my apologies */
                     "r");
    if (fPointer == NULL) {
        printf("couldn't open file: %s\n", strerror(errno));
        exit(1);
    }
    fscanf(fPointer, "%d", &numberOfHandshaking); // Reads the first line
    printf("numberOfHandshakings: %d\n", numberOfHandshaking);

    int *persons = malloc((population + 1) * sizeof(int));
    if (persons == NULL) {
        printf("malloc failed: %s\n", strerror(errno));
        exit(1);
    }
    for (int i = 0; i < 127; i++) {
        printf("persons[%d]= %d \n", i, persons[i]);
    }
    printf("\n\n\n");
    return 0;
}
and it ran fine! :)
In your program you #include headers that you don't need. Those are:
#include <ctype.h>
#include <math.h>
#include <time.h>
Upvotes: 1
Reputation: 23218
If you do not know the needed buffer size before runtime, allocating in smaller bites is preferable to requesting one very large chunk, where the likelihood of failure due to lack of contiguous memory is high.
Start off with calloc/malloc, then grow with realloc:
// Resize '*str' to 'size' bytes; returns the new pointer, or NULL on
// failure (in which case the old buffer is freed and '*str' is cleared,
// so the caller is never left holding a dangling pointer).
static char *ReSizeBuffer(char **str, size_t size)
{
    char *tmp = NULL;

    if (!str || !(*str)) return NULL;
    if (size == 0) {
        free(*str);
        *str = NULL;
        return NULL;
    }
    tmp = realloc(*str, size);   // no cast needed in C
    if (!tmp) {
        free(*str);
        *str = NULL;
        return NULL;
    }
    *str = tmp;
    return *str;
}
Upvotes: 0
Reputation: 67376
When working with ridiculous numbers like these, compile your program for 64 bits. Or simply write better code, because I can guarantee you that you don't need that array allocated like that.
Upvotes: 2