Joe DF

Reputation: 5548

String termination - char c=0 vs char c='\0'

When terminating a string, it seems to me that logically char c = 0 is equivalent to char c = '\0', since the null byte (ASCII 0) has the value 0, but people usually write '\0' instead. Is this purely a matter of preference, or is it considered better practice?

What is the preferred choice?


EDIT: K&R says: "The character constant '\0' represents the character with value zero, the null character. '\0' is often written instead of 0 to emphasize the character nature of some expression, but the numeric value is just 0."

Upvotes: 55

Views: 121355

Answers (4)

Chris Bao

Reputation: 2868

The answers above are already quite clear; I'll just share what I learned about this issue with a demo.

#include <stdlib.h>
#include <stdio.h>


char*
mystrcat(char *dest, char *src) {
    size_t i,j;
    for(i = 0; dest[i] != '\0'; i++)
        ;
    for(j = 0; src[j] != '\0'; j++)
        dest[i+j] = src[j];
    dest[i+j] = '\0';
    return dest;
}

int main(void) {
    char *str = malloc(20); // malloc allocates memory but does not initialize it
    // str[0] = '\0';
    str[0] = 0;
    for (int k = 0; k < 10; k++) {
        char s[2];
        sprintf(s, "%d", k);
        mystrcat(str, s);
    }
    printf("debug:%s\n", str);
    free(str);
    return 0;
}

In the above program, I used malloc to allocate memory for the pointer, but malloc doesn't initialize that memory. So after the mystrcat operation (which is nearly the same as the strcat function in glibc), the string may contain garbage, since the memory content is uninitialized.

So I need to initialize the memory first. In this case, str[0] = 0 and str[0] = '\0' both make it work.

Upvotes: 0

Nobilis

Reputation: 7448

http://en.wikipedia.org/wiki/Ascii#ASCII_control_code_chart

Binary   Oct  Dec    Hex    Abbr    Unicode  Control char  C Escape code   Name
0000000  000  0      00     NUL     ␀       ^@            \0              Null character

There's no difference, but the more idiomatic one is '\0'.

Putting it down as char c = 0; could mean that you intend to use it as a number (e.g. a counter). '\0' is unambiguous.

Upvotes: 48

Pupkov-Zadnij

Reputation: 1392

The preferred choice is the one that lets people reading your code understand how you are using your variable: as a number or as a character. Best practice is to use 0 when you mean your variable as a number and '\0' when you mean it as a character.

Upvotes: 6

Michal Bukovy

Reputation: 420

'\0' is just a character constant, the same as 'A', '0', or '\n'.
If you write char c = '\0', it's the same as char c = 0;
If you write char c = 'A', it's the same as char c = 65 (on ASCII systems).

It's just a character representation, and it's good practice to write it when you really mean the null byte of a string. Since char in C is a one-byte integral type, '\0' has no special meaning beyond its value.

Upvotes: 20
