Reputation: 17577
I can convert a single character to binary using std::bitset<CHAR_BIT> binary('c');
but this doesn't work for a string:
std::string str = "MyString";
std::bitset<SIZE_OF_STRING_IN_BITS> binary(str); // throws std::invalid_argument
What should I use instead?
Upvotes: 2
Views: 545
Reputation: 40578
Why do you want to put the raw characters into a bitset? Why not a std::vector<char>?
In any event, you can get at the raw underlying bytes of a string via its c_str()
member function, which returns a const char* to the string's data.
Upvotes: 3
Reputation: 47488
You could repeat this action over every character of the string, shifting the bitset left by CHAR_BIT each time:
#include <bitset>
#include <string>
#include <iostream>
#include <numeric>
#include <climits>

template<size_t N>
std::bitset<N> string_to_bitset(const std::string& s)
{
    return std::accumulate(s.begin(), s.end(), std::bitset<N>(),
        [](const std::bitset<N>& l, char r)
        {
            // Cast through unsigned char so negative char values
            // don't sign-extend into the high bits.
            return (l << CHAR_BIT) | std::bitset<N>(static_cast<unsigned char>(r));
        });
}

int main()
{
    std::string str = "MyString";
    // "MyString" is 8 characters, CHAR_BIT bits each
    const size_t SIZE_OF_STRING_IN_BITS = CHAR_BIT * 8;
    std::bitset<SIZE_OF_STRING_IN_BITS> binary =
        string_to_bitset<SIZE_OF_STRING_IN_BITS>(str);
    std::cout << binary << '\n';
}
Seeing as the size of the bitset has to be a constant expression, I would go with boost::dynamic_bitset
unless this string is a compile-time constant.
Upvotes: 1