Samy

Reputation: 1033

Can someone explain this?

I really don't understand this. Maybe someone can explain it to me. I wanted to find out how many elements are in dataInput. I'm programming in C.

void Calibrate_data(float* dataInput)
{   
    int dataSize = sizeof(*dataInput)/sizeof(float);
    printf("%i and %i",sizeof(*dataInput)/sizeof(float),dataSize);
}

The output is:

1 and 10

Upvotes: 5

Views: 160

Answers (3)

sujin

Reputation: 2853

When you pass an array to a function, it decays to a pointer. The element count cannot be recovered from the pointer, so sizeof will not give the number of elements. If you need the count, pass the array size as a separate argument to the function.

Here *dataInput is the first element, which has type float, so sizeof(*dataInput) is the same as sizeof(float).

sizeof(*dataInput)/sizeof(float) = 4/4 = 1;

Example Code:

#include <stdio.h>
void Calibrate_data(float* dataInput);
int main()
{
    float data[] = { 12.22, 15.15 };
    Calibrate_data(data);
    return 0;
}

void Calibrate_data(float *dataInput)
{   
    printf("float size : %zu %zu\n", sizeof(*dataInput), sizeof(float));
    int dataSize = sizeof(*dataInput)/sizeof(float);
    printf("%zu and %d\n",sizeof(*dataInput)/sizeof(float),dataSize);
}

Output:

float size : 4 4
1 and 1
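
A minimal sketch of the size-passing approach suggested above (the parameter name dataSize and the sample array are illustrative):

#include <stdio.h>

/* The element count is computed where the array type is still known,
   then passed alongside the decayed pointer. */
void Calibrate_data(float *dataInput, size_t dataSize)
{
    for (size_t i = 0; i < dataSize; i++)
        printf("%zu: %f\n", i, dataInput[i]);
}

int main(void)
{
    float data[] = { 12.22f, 15.15f };
    Calibrate_data(data, sizeof(data) / sizeof(data[0]));
    return 0;
}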

EDIT: The output 10 might be due to the wrong format specifier, or the output may really be '1' with the '0' printed from somewhere else. Because no newline character was added at the end of the printf, we cannot be sure the trailing 0 came from this printf.

Upvotes: 2

Shafik Yaghmour

Reputation: 158629

You are using the wrong format specifier here:

printf("%i and %i",sizeof(*dataInput)/sizeof(float),dataSize);
        ^^

sizeof yields a value of type size_t, which is unsigned; the correct format specifier is %zu (or %Iu in Visual Studio).

Using the wrong format specifier invokes undefined behavior, but that does not seem to explain the output of 10 for dataSize, which does not make sense since sizeof(*dataInput) will be the size of a float. We would therefore expect sizeof(*dataInput)/sizeof(float) to be 1; as Macattack said, an SSCCE should help resolve that output.
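
A minimal SSCCE along those lines, with the correct specifier (assuming a C99-conforming printf and a typical 4-byte float):

#include <stdio.h>

int main(void)
{
    float data[10];
    float *dataInput = data;

    /* sizeof yields size_t, so %zu is the matching specifier */
    printf("%zu and %zu\n", sizeof(*dataInput), sizeof(*dataInput) / sizeof(float));
    return 0;
}

On such a platform this prints "4 and 1", never 10.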

Upvotes: 4

Michael Burr

Reputation: 340516

This is probably a problem on a 64-bit platform that uses a 32-bit int and a 64-bit size_t (the type of sizeof's result).

In this scenario:

printf("%i and %i",sizeof(*dataInput)/sizeof(float),dataSize);
         ^      ^  -------------------------------- --------
         |      |                  ^                    ^
         |      |                  |                    +---- 32-bit operand
         |      |                  +--- 64-bit operand
         |      |
         |      +--- expects 32-bit operand
         |
         +--- expects 32-bit operand

The mismatch in conversion specifiers and operands results in undefined behavior.

One of the following should fix the problem:

printf("%i and %i",(int) (sizeof(*dataInput)/sizeof(float)),dataSize);  // cast size_t type to int


// as mentioned by Shafik Yaghmour in his answer - http://stackoverflow.com/a/21266299/12711

// this might not be supported on compilers that don't claim C99 support, 
//  for example, MSVC docs indicate that "I" should be used instead of "z" for size_t types

printf("%zu and %i",sizeof(*dataInput)/sizeof(float),dataSize);  // C99 - use correct conversion spec
printf("%Iu and %i",sizeof(*dataInput)/sizeof(float),dataSize);  // MSVC - use 'correct' conversion spec

Upvotes: 2
