PyariBilli

Reputation: 531

How to assign a specific array size in C header file

I have a constant called A defined in a header file; it is always an even number.

I would like to declare an array B in the same header file whose length depends on the value of A.

For example:

If A is 32, I would like the array size to be 4, with 2, 4, 8, 16 as the elements (all powers of 2 up to A/2).

If A is 48, I would like the array size to be 5, with 2, 4, 8, 16, 24 as the elements (all powers of 2 up to A/2, with A/2 itself as the last element).

What is an elegant way to calculate the array size of B given a value of A using the above logic?
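
To make the rule concrete: the size I am after is the number of powers of 2 up to A/2, plus one extra element when A/2 is not itself a power of 2. Here is a small runtime sketch of that rule, only to pin down the expected values (the count_b name is just for illustration; I want this resolved in the header, not at run time):

/* Runtime illustration of the sizing rule only. */
static int count_b(int a)
{
    int n = 0, p;
    for (p = 2; p <= a / 2; p *= 2)
        n++;                /* one element per power of 2 up to A/2 */
    if (p / 2 != a / 2)
        n++;                /* plus A/2 itself when it is not a power of 2 */
    return n;               /* count_b(32) == 4, count_b(48) == 5 */
}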

Upvotes: 2

Views: 2379

Answers (3)

Jonathan Leffler

Reputation: 753675

You need to use a #define for A, but given that you do that, you can manage using some scheme like this:

#define A 384

#define B_SIZE (A <= 16 ? 3 : A <= 32 ? 4 : A <= 64 ? 5 : A <= 128 ? 6 : \
                A <= 256 ? 7 : A <= 512 ? 8 : A <= 1024 ? 9 : -1)

extern int B[B_SIZE];

The use of -1 as the tail value is not accidental; it ensures a compilation error because you can't have arrays with a negative size, so if A is too big, the code won't compile. You could use the same trick to rule out sizes that are too small, too.
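
For instance (a variation on the macro above, not part of it), a leading branch can rule out values of A below the range the table covers in the same way, say anything under 16:

#define B_SIZE (A < 16 ? -1 : \
                A <= 16 ? 3 : A <= 32 ? 4 : A <= 64 ? 5 : A <= 128 ? 6 : \
                A <= 256 ? 7 : A <= 512 ? 8 : A <= 1024 ? 9 : -1)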

In the code that defines and initializes the array, you use:

int B[B_SIZE] =
{
    2, 4,
#if A > 16
    8,
#endif
#if A > 32
    16,
#endif
#if A > 64
    32,
#endif
#if A > 128
    64,
#endif
#if A > 256
    128,
#endif
#if A > 512
    256,
#endif
    A/2
};

This isn't elegant, but I'm not sure there's a neater way to do it. Using B_SIZE explicitly here ensures, once again, a compilation error if the value of A is out of bounds. You could otherwise just leave B_SIZE out of the array definition.

You could easily write a shell (or Awk, Perl, Python, …) script to generate the code up to a given size.
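
Purely as an illustration of that idea (a sketch in C rather than a scripting language, and not part of the answer above), a small generator could print the B_SIZE macro and the conditional initializer in the pattern shown earlier; the 1024 limit here is arbitrary:

#include <stdio.h>

int main(void)
{
    const int limit = 1024;    /* largest A the generated code will accept */
    int p, size;

    /* Emit the B_SIZE macro: one ternary branch per doubling of A. */
    printf("#define B_SIZE (");
    for (p = 16, size = 3; p <= limit; p *= 2, size++)
        printf("A <= %d ? %d : ", p, size);
    printf("-1)\n\n");

    /* Emit the initializer: 2, 4, one #if-guarded element per doubling,
       and A/2 as the final element. */
    printf("int B[B_SIZE] =\n{\n    2, 4,\n");
    for (p = 16; p < limit; p *= 2)
        printf("#if A > %d\n    %d,\n#endif\n", p, p / 2);
    printf("    A/2\n};\n");

    return 0;
}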

Upvotes: 3

Nisse Engström

Reputation: 4752

If we can limit A to always be a power of two, then this is related to another problem: "How do you compute the number of value bits in an integer type at compile time?" This problem was solved beautifully by Hallvard B Furuseth more than a decade ago. See: https://stackoverflow.com/a/4589384/3478852

Using Hallvard's solution, we can compute the array size as follows:

#include <stdio.h>

/* Number of bits in inttype_MAX, or in any (1<<k)-1 where 0 <= k < 2040 */
#define IMAX_BITS(m) ((m)/((m)%255+1) / 255%255*8 + 7-86/((m)%255+12))

#define A 1073741824

int B[IMAX_BITS(A/2-1)];

int main (void)
{
  printf ("%d\n", (int)(sizeof B / sizeof B[0]));
  return 0;
}

Note:

  • The number A must be a power of two.
  • The code uses the simpler version of the macro that is only accurate up to 2039 bits. There is another version in the article for ridiculously large numbers.
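
Since the macro only fixes the size, the contents (2, 4, ..., A/2) still have to be filled in somewhere. A minimal run-time sketch, assuming the definitions above (the fill_B name is just illustrative):

/* Fill B with 2, 4, ..., A/2; relies on A being a power of two. */
static void fill_B (void)
{
  int n = (int)(sizeof B / sizeof B[0]);
  int v = 2;

  for (int i = 0; i < n; i++, v *= 2)
    B[i] = v;
}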

Upvotes: 1

Marian

Reputation: 7472

Maybe I am missing something. Why don't you simply define the array as:

int array[] = {
   2,4,8,16
#if A == 48
   ,24
#endif
};

#define DIM_OF_ARRAY (sizeof(array)/sizeof(array[0]))
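
A quick usage sketch (the main function below is only illustrative, assuming the definitions above):

#include <stdio.h>

int main(void)
{
   /* DIM_OF_ARRAY reflects however many initializers the preprocessor kept,
      so the size never has to be spelled out by hand. */
   for (size_t i = 0; i < DIM_OF_ARRAY; i++)
      printf("%d\n", array[i]);
   return 0;
}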

Upvotes: 2
