Reputation: 2919
Why is this program showing the following output?
#include <bitset>
#include <iostream>

int main()
{
    std::bitset<8> b1(01100100); std::cout << b1 << std::endl;
    std::bitset<8> b2(11111111); std::cout << b2 << std::endl; // this variable has been assigned
                                                               // the value 11111111, whereas during
                                                               // execution it takes the value 11000111
    std::cout << "b1 & b2: " << (b1 & b2) << '\n';
    std::cout << "b1 | b2: " << (b1 | b2) << '\n';
    std::cout << "b1 ^ b2: " << (b1 ^ b2) << '\n';
}
This is the output:
01000000
11000111
b1 & b2: 01000000
b1 | b2: 11000111
b1 ^ b2: 10000111
First, I thought there was something wrong with the header file (I was using MinGW), so I checked with MSVC. But it showed the same thing. Please help.
Upvotes: 14
Views: 1257
Reputation: 500307
Despite the appearance, the 11111111 is decimal. The binary representation of 11111111₁₀ is 101010011000101011000111₂. Upon construction, std::bitset<8> takes the eight least significant bits of that: 11000111₂.
The first case is similar, except the 01100100 is octal (due to the leading zero). The same number expressed in binary is 1001000000001000000₂, and its eight least significant bits are 01000000₂.
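You can check both conversions directly. A minimal sketch (the % 256 step simply extracts the low eight bits, which is exactly what std::bitset<8> keeps):
#include <bitset>
#include <iostream>

int main()
{
    std::cout << (11111111 % 256) << '\n';         // 199, i.e. 11000111 in binary
    std::cout << (01100100 % 256) << '\n';         // 64,  i.e. 01000000 in binary
    std::cout << std::bitset<8>(11111111) << '\n'; // 11000111
    std::cout << std::bitset<8>(01100100) << '\n'; // 01000000
}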
One way to represent a bitset with a value of 11111111₂ is std::bitset<8> b1(0xff).
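If your compiler supports C++14 binary literals, you can also write the bits directly:
std::bitset<8> b1(0b01100100);
std::bitset<8> b2(0b11111111);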
Alternatively, you can construct a bitset from a binary string:
std::bitset<8> b1(std::string("01100100"));
std::bitset<8> b2(std::string("11111111"));
Upvotes: 48
Reputation: 6260
As per NPE's answer, you are constructing the bitset with an unsigned long, and not with bits as you were expecting. An alternative way to construct it, which lets you specify the individual bits, is to use the string constructor as follows:
#include <bitset>
#include <cstdio>
#include <iostream>
#include <string>

int main()
{
    std::bitset<8> b1(std::string("01100100")); std::cout << b1 << std::endl;
    std::bitset<8> b2(std::string("11111111")); std::cout << b2 << std::endl;
    std::cout << "b1 & b2: " << (b1 & b2) << '\n';
    std::cout << "b1 | b2: " << (b1 | b2) << '\n';
    std::cout << "b1 ^ b2: " << (b1 ^ b2) << '\n';
    getchar(); // keep the console window open until Enter is pressed
    return 0;
}
Running this prints:
01100100
11111111
b1 & b2: 01100100
b1 | b2: 11111111
b1 ^ b2: 10011011
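One caveat with the string constructor: it throws std::invalid_argument if the string contains any character other than '0' and '1'. For example (a deliberately bad, hypothetical input):
std::bitset<8> bad(std::string("0110210x")); // throws std::invalid_argument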
Upvotes: 7