Learner

Reputation: 157

How can I understand the result of this testcase in this code challenge?

I am trying to understand the first testcase of this challenge in codeforces.

The description is:

Sergey is testing a next-generation processor. Instead of bytes the processor works with memory cells consisting of n bits. These bits are numbered from 1 to n. An integer is stored in the cell in the following way: the least significant bit is stored in the first bit of the cell, the next significant bit is stored in the second bit, and so on; the most significant bit is stored in the n-th bit. Now Sergey wants to test the following instruction: "add 1 to the value of the cell". As a result of the instruction, the integer that is written in the cell must be increased by one; if some of the most significant bits of the resulting number do not fit into the cell, they must be discarded. Sergey wrote certain values of the bits in the cell and is going to add one to its value. How many bits of the cell will change after the operation?

Summary

Given a binary number, add 1 to its value and count how many bits change after the operation.

Testcases

Test case 1: n = 4, cell bits 1100, expected output 3

Test case 2: n = 4, cell bits 1111, expected output 4

Note: In the first sample the cell ends up with value 0010, and in the second sample with 0000.

In the second test case, 1111 is 15, and 15 + 1 = 16 (10000 in binary), so all the 1's change; therefore the answer is 4.

But in the first test case, 1100 is 12, and 12 + 1 = 13 (1101), where only the last bit changes, so why is the answer 3?

Upvotes: 2

Views: 104

Answers (1)

BartoszKP

Reputation: 35911

You've missed the crucial part: the least significant bit is the first one (i.e. the leftmost one), not the last one as in the usual way of writing binary numbers.

Thus, 1100 is not 12 but 3. And so, 1100 + 1 = 3 + 1 = 4 = 0010, so 3 bits are changed.
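
A minimal Python sketch of that simulation (my own illustrative code, not from the problem setters) reproduces both sample answers:

def changed_bits(cell: str) -> int:
    """Count how many bits of an n-bit cell change when 1 is added.

    The string is given LSB-first, exactly as in the problem statement.
    """
    n = len(cell)
    # Reverse to the usual MSB-first order before parsing the value.
    value = int(cell[::-1], 2)
    # Add 1 and discard any carry that does not fit into n bits.
    new_value = (value + 1) % (1 << n)
    # Write the result back LSB-first and compare position by position.
    new_cell = format(new_value, f"0{n}b")[::-1]
    return sum(a != b for a, b in zip(cell, new_cell))

print(changed_bits("1100"))  # 3 -> the cell becomes 0010
print(changed_bits("1111"))  # 4 -> the cell becomes 0000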

The "least significant bit" means literally a bit that is not the most significant, so you can understand it as "the one representing the smallest value". In binary, the bit representing 2^0 is the least significant. So the binary code in your task is written as follows:

bit no. 0    1    2    3    4   (...)
value   2^0  2^1  2^2  2^3  2^4 (...)
        | least             | most
        | significant       | significant
        | bit               | bit

that's why 1100 is:

1100 = 1 * 2^0 + 1 * 2^1 + 0 * 2^2 + 0 * 2^3 = 1 + 2 + 0 + 0 = 3

not the other way around (as we usually write it).
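
If you want to double-check this interpretation, here is a quick Python sketch (again my own illustrative code) that mirrors the sum above:

bits = "1100"  # first character = least significant bit

# Sum each bit times its power of two, starting from 2^0 at the front.
value = sum(int(b) << i for i, b in enumerate(bits))
print(value)          # 3, i.e. 1*2^0 + 1*2^1 + 0*2^2 + 0*2^3

# The usual MSB-first reading would give 12 instead.
print(int(bits, 2))   # 12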

Upvotes: 2
