some_id

Reputation: 29906

Do Compilers implicitly convert int types

If a value is set to an int e.g. 2, does the compiler convert the int types to the size it needs e.g. int8_t or uint16_t etc.?

Upvotes: 0

Views: 277

Answers (5)

Squall

Reputation: 4472

Integer constants are values of type "int". When you assign an int value to a short or char using the "=" operator, the int value is converted to short or char. For constants, the compiler can detect this conversion and perform it at compile time.

short a = 50; // 50 is an int; it will be implicitly converted to short. The compiler may convert 50 to short at compile time.
int b = 60;
short c = b; // the value of b will be converted to short and assigned to c.
short d = b + 70; // b + 70 is a sum of ints, and the result is an int that will be converted to short.

int8_t and uint16_t are fixed-width types declared in <stdint.h> since C99 (the exact-width types are optional, but available on any platform whose hardware supports them). On a typical platform they are defined as something like:

typedef signed char int8_t;
typedef unsigned short uint16_t;

Upvotes: 0

edgar.holleis

Reputation: 5001

The compiler first looks at the context of the expression to learn what type it expects. The context can be, for example:

  • Left hand side of an assignment
  • Expected argument type provided by the function header or operator definition

It then evaluates the expression, inserting implicit type conversions as needed (type coercion). These conversions include:

  • Promotion
  • Truncation
  • Rounding

In situations where all the bits matter, you need to be extremely careful about what you write: types, operators, and order of evaluation.

Upvotes: 0

misosim

Reputation: 11

For constants this may be true; often the reverse conversion, small to large, is also done: byte to int, for example.

It's somewhat dependent on the implementation and optimization techniques used by the compiler, and on the data-alignment requirements of the architecture/OS.

Upvotes: 0

Simone

Reputation: 11797

If you write

int value = 2;

the type is, by default, signed int. What the compiler does then really depends on the platform, but it has to guarantee that int's size is not less than short int's and not greater than long int's.

Upvotes: 0

Pål Brattberg

Reputation: 4698

Not in vanilla C, no. The compiler can't possibly know what you meant if you do not tell it.

Upvotes: 1
