Tim

Reputation: 2163

Decode bits in C

I'm trying to decode binary that was encoded (most significant bit first) with the following loop.

int ch; // Has a value of, for example, 97 (which putchar(ch) prints as 'a')
for (int i = 0; i < CHAR_BIT; i++) {
  printf("%d", !!((ch << i) & 0x80)); // !! normalizes the masked bit to 0 or 1
}

So far I have tried:

unsigned int byte[CHAR_BIT]; // Filled elsewhere
unsigned char result = 0;
for (int i = 0; i < CHAR_BIT; i++) {
  result |= (byte[i] == '1') << ((CHAR_BIT - 1) - i);
}
putchar(result);

But the output is wrong; it looks as if the characters were shifted by the wrong amount. Assuming the first block of code is in a file called prog1 and the second is in prog2, the output of the following shell command should be abc, but it is `bb (a literal backtick followed by bb).

echo "abc" | ./prog1 | ./prog2

Upvotes: 1

Views: 2667

Answers (1)

cdlane

Reputation: 41852

This works for me:

prog1.c

#include <stdio.h>

#define CHAR_BIT 8 /* normally provided by <limits.h> */

/* Print c as CHAR_BIT ASCII digits, most significant bit first. */
void encode(int c) {
    for (int i = 0; i < CHAR_BIT; i++) {
        printf("%d", !!((c << i) & 0x80));
    }
}

int main() {
    int c;

    while ((c = getchar()) != EOF) {
        encode(c);
    }

    printf("\n");

    return 0;
}
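
For what it's worth, the bit test can also be written with a right shift (my reformulation, not from the original answer), which avoids hardcoding the 0x80 mask:

#include <stdio.h>
#include <limits.h>

/* Same output as encode() above: prints bit (CHAR_BIT - 1 - i) of c
   on each iteration, most significant bit first. */
void encode(int c) {
    for (int i = 0; i < CHAR_BIT; i++) {
        printf("%d", (c >> (CHAR_BIT - 1 - i)) & 1);
    }
}

int main(void) {
    encode('a'); /* prints 01100001 on an 8-bit-char platform */
    printf("\n");
    return 0;
}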

prog2.c

#include <stdio.h>

#define CHAR_BIT 8

/* Rebuild one character from CHAR_BIT ASCII '0'/'1' digits, MSB first. */
void decode(char *byte) {
    int c = 0;

    for (int i = 0; i < CHAR_BIT; i++) {
        c |= (byte[i] == '1') << ((CHAR_BIT - 1) - i);
    }

    putchar(c);
}

int main() {
    char byte[CHAR_BIT + 1]; /* +1 for scanf's terminating NUL */

    /* %8s reads at most 8 non-whitespace characters, so scanf skips
       the literal newline that prog1 prints at the end. */
    while (scanf("%8s", byte) == 1) {
        decode(byte);
    }

    return 0;
}

EXAMPLE

> echo "abc" | ./prog1 
01100001011000100110001100001010
> echo "abc" | ./prog1 | ./prog2
abc
> 

Note the 32 bits in the first run: echo appends a newline, so four characters are encoded, and the final group 00001010 decodes back to '\n'. If the encode/decode logic is the same as yours, then this line is suspect:

unsigned int byte[CHAR_BIT]; // Filled elsewhere

and knowing what transpired elsewhere might help to explain what went wrong.
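
Since that fill code wasn't shown, here is one hypothetical way it could be written (a sketch under the assumption that the bits arrive as ASCII '0'/'1' characters on stdin; the names and structure are mine, not the asker's):

#include <stdio.h>
#include <limits.h>

int main(void) {
    unsigned int byte[CHAR_BIT]; /* same declaration as in the question */
    int ch;

    for (;;) {
        int i = 0;
        /* Collect CHAR_BIT digit characters, skipping any newline. */
        while (i < CHAR_BIT && (ch = getchar()) != EOF) {
            if (ch == '0' || ch == '1')
                byte[i++] = (unsigned int)ch;
        }
        if (i < CHAR_BIT)
            break; /* EOF before a complete group of bits */

        unsigned char result = 0;
        for (int j = 0; j < CHAR_BIT; j++)
            result |= (byte[j] == '1') << ((CHAR_BIT - 1) - j);
        putchar(result);
    }

    return 0;
}

One failure mode that reproduces the `bb symptom exactly is the last bit of every group reading as 0. For instance, fgets(buf, CHAR_BIT, stdin) stores only CHAR_BIT - 1 characters plus a NUL, so buf[CHAR_BIT - 1] never compares equal to '1': 'a' (01100001) becomes '`' (01100000), 'b' (01100010) is unchanged, and 'c' (01100011) becomes 'b'.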

Upvotes: 2
