Kenneth

Reputation: 1167

Mixing typedefs and defines to force data type sizes

I'm doing module testing of code written for a different platform and am having a problem where I need to constrain the sizes of the data types in the module being tested. Since I can't modify the module file directly, I thought of using the stdint.h typedefs and replacing the module's declarations using defines. In essence, this:

#include <stdint.h>
#include <stdio.h>
#define int int16_t

int main() {
    uint16_t ui = 2;
    unsigned int uii = 3;
    printf("Hello\n");
    printf("Test %d, %d\n", ui, uii);
    return 0;
}

However, this fails with the following message:

error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘uii’

Is there another way to do this kind of type replacement?

Upvotes: 2

Views: 527

Answers (3)

Steve Jessop

Reputation: 279255

The signed, unsigned, long, short modifiers can only be applied to the keywords that denote built-in types. They can't be applied to a typedef name that happens to be an alias for a built-in type: the C syntax doesn't allow it.
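For example, with an ordinary typedef (my_int is just an illustrative name) the same restriction shows up:

typedef int my_int;

my_int a;           /* fine: my_int is an alias for int */
unsigned my_int b;  /* error: 'unsigned' cannot modify a typedef name */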

As for what you can do, I think Mysticial's answer covers it. Changing the size of int will conflict with whatever ABI the compiler is using to make library and system calls, so you can't really do it without compiler support. For example, suppose you have a function declared as follows:

int foo(int a);

If you replace all mentions of int with short in the TU that calls that function, but not in the TU that implements it, then the caller will pass and receive a short, whereas the function implementation expects and returns an int. This won't necessarily work. You need all libraries, including the standard libraries and any system calls they make, to be compiled such that caller and callee agree on what an int is.
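As a rough sketch of that mismatch (the file and function names here are only illustrative):

/* lib.c -- compiled normally, so foo() genuinely takes and returns an int */
int foo(int a) { return a + 1; }

/* test.c -- compiled with the replacement in effect */
#define int short
int foo(int a);   /* the caller now thinks foo() takes and returns a short */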

One option, of course, is to change all the code you're testing to use macros in place of int, unsigned int, and so on. Then any function declarations in headers are left alone. There will be implicit conversions where the types don't match, which might provoke compiler warnings and truncate values, but at least the behavior is defined. Basically it's dependency injection via the preprocessor.
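A minimal sketch of that approach, assuming a macro name like MODULE_INT chosen for the test build (the name is just an example):

/* module.c -- internals use the macro; the public signature keeps plain int */
#include <stdint.h>

#ifndef MODULE_INT
#define MODULE_INT int            /* production build: identical to before */
#endif

int add(int a, int b)             /* header still declares: int add(int, int); */
{
    MODULE_INT sum = a + b;       /* test build: narrowed to the 16-bit type */
    return sum;                   /* implicitly converted back to int */
}

The test build then passes something like -DMODULE_INT=int16_t on the compiler command line, while the production build leaves the macro undefined and gets plain int.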

Upvotes: 1

Mysticial

Reputation: 471229

Your particular example fails because it's being expanded to:

unsigned int16_t uii = 3;

and the unsigned modifier doesn't work on int16_t, because it can only be applied to the built-in type keywords, not to a typedef name.

Now to answer the question: I don't think you can do this unless the compiler has an internal option to change the size of int. Trying to force it will clash with internal library functions.

For example, your printf() will also break because %d expects a normal int, but you'd be passing it a 16-bit integer.

EDIT: The printf() case is not a good example, since int16_t will be promoted to int in the call. But the general idea still holds. (see comments)
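For completeness, when you really do want to print a fixed-width type, the matching format macros from inttypes.h remove the guessing:

#include <inttypes.h>
#include <stdio.h>

int main(void) {
    int16_t v = 42;
    printf("Test %" PRId16 "\n", v);  /* format specifier matching int16_t */
    return 0;
}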

Upvotes: 3

Vms

Reputation: 1

The key is to use

#define int short

instead.
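This compiles where the int16_t version doesn't because unsigned short is a valid combination of keywords, so the expansion stays legal C:

#define int short

unsigned int uii = 3;   /* expands to: unsigned short uii = 3; which is valid */

It still redefines a keyword for the whole translation unit (including declarations such as int main), so the ABI caveats from the other answers apply.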

Upvotes: 0
