Reputation: 65163
string convert_binary_to_hex(string binary_value, int number_of_bits)
{
    bitset<number_of_bits> set(binary_value);
    ostringstream result;
    result << hex << set.to_ulong() << endl;
    return result.str();
}
In the above method, I am converting binary strings to hex strings. Since each hex digit represents 4 bits, the number_of_bits variable needs to be a multiple of 4; the binary_value could range anywhere from 4 bits to 256 bits in the application I'm writing. How do I get bitset to take a variable size?
My imports:
#include <stdio.h>
#include <iostream>
#include <string>
#include <bitset>
#include <sstream>
Upvotes: 7
Views: 5997
Reputation: 283713
You could use the fact that one 8-bit character always needs exactly two hex digits. There's no need to turn the entire string into a bit sequence all at once; the string elements can be processed separately.
string convert_octets_to_hex(string value)
{
    string result(2*value.size(), '\0');  // two hex digits per input byte
    for( size_t i = 0; i < value.size(); i++ ) {
        result[2*i] = "0123456789abcdef"[(value[i] >> 4) & 0x0f];
        result[2*i+1] = "0123456789abcdef"[value[i] & 0x0f];
    }
    return result;
}
Oh, I see you have a string of 1-bit characters ('0' and '1'). That can be handled the same way:
string convert_binary_to_hex(string binary_value, int number_of_bits = -1)
{
    if (number_of_bits < 0) number_of_bits = binary_value.size();
    string result((number_of_bits + 3) / 4, '\0');
    unsigned work;
    const char* in = binary_value.c_str();
    char* out = &result[0];
    // Handle the leading 1-3 bits that don't fill a whole hex digit.
    if (number_of_bits & 3) {
        work = 0;
        while (number_of_bits & 3) {
            work <<= 1;
            work |= *(in++) & 1;
            number_of_bits--;
        }
        *(out++) = "0123456789abcdef"[work];
    }
    // The remaining bits come in exact groups of four.
    while (number_of_bits) {
        work = ((in[0] & 1) << 3) | ((in[1] & 1) << 2) | ((in[2] & 1) << 1) | (in[3] & 1);
        in += 4;
        *(out++) = "0123456789abcdef"[work];
        number_of_bits -= 4;
    }
    return result;
}
EDIT: Fixed some bugs, added a demo
Upvotes: 1
Reputation: 4887
Make your method a template as well if you can know the size at compile time; otherwise you'll need to use std::vector<bool>, which is actually specialized to use only one bit per bool anyway, but then you'll have to build the unsigned long manually using ORs and bit shifts.
//template version
template <size_t number_of_bits>
string convert_binary_to_hex(string binary_value) {
    bitset<number_of_bits> set(binary_value);
    ostringstream result;
    result << hex << set.to_ulong();  // endl dropped so the result has no trailing newline
    return result.str();
}
But since you're already assuming that an unsigned long will be large enough to hold the number of bits, and since extra bits won't make a difference for the code you've given, why not just make the bitset the width of an unsigned long? (Note that sizeof gives bytes, not bits, hence the multiplication by CHAR_BIT.)
//reuses the unsigned long assumption; CHAR_BIT comes from <climits>
string convert_binary_to_hex(string binary_value) {
    bitset<sizeof(unsigned long) * CHAR_BIT> set(binary_value);
    ostringstream result;
    result << hex << set.to_ulong();
    return result.str();
}
Or else you can just have 2 functions, one that performs the actual conversion for a 4 bit number, and another that uses that function to build up arbitrary length numbers:
string convert_nibble_to_hex(string binary_value) {
    bitset<4> set(binary_value);
    ostringstream result;
    result << hex << set.to_ulong();  // no endl, or each digit would carry a newline
    return result.str();
}
string convert_binary_to_hex(string binary_value) {
//call convert_nibble_to_hex binary_value.length()/4 times
//and concatenate results
}
Upvotes: 2
Reputation: 29490
As far as I remember you can use templates to get around this issue:
template <size_t number_of_bits>
string convert_binary_to_hex(string binary_value)
{
    bitset<number_of_bits> set(binary_value);
    ostringstream result;
    result << hex << set.to_ulong();  // endl dropped so the result has no trailing newline
    return result.str();
}
then call it like this:
convert_binary_to_hex<32>("101111000110000101001110");
Note that the argument must be a binary string, and the size must still be a compile-time constant, but every call can now use a different constant :)
Upvotes: 2
Reputation: 46627
You don't - std::bitset is a template, and its size must be specified at compile time. You need to make convert_binary_to_hex a template of its own. If the size is only known at runtime, you have to find another solution.
template<size_t number_of_bits>
string convert_binary_to_hex(string binary_value)
{
    bitset<number_of_bits> set(binary_value);
    ostringstream result;
    result << hex << set.to_ulong();  // endl dropped so the result has no trailing newline
    return result.str();
}
Upvotes: 3
Reputation: 34418
You can't. Template parameters like that need to be known at compile time since the compiler will need to generate different code based on the values passed.
In this case you probably want to iterate through your string instead and build up the value yourself, e.g.
unsigned long result = 0;
for(size_t i = 0; i < binary_value.length(); ++i)
{
    result <<= 1;
    if (binary_value[i] != '0') result |= 1;
}
which also assumes that your result fits in an unsigned long, though, and won't accommodate a 256-bit value - but neither will your sample code. You'll need a big-number type for that.
Upvotes: 8
Reputation: 133044
std::bitset
's size can only be a compile-time known constant (constant expression) because it is an integral template parameter. Constant expressions include integral literals and/or constant integer variables initialized with constant expressions.
e.g.
std::bitset<4> q; //OK, 4 is a constant expression
const int x = 4;
std::bitset<x> qq; //OK, x is a constant expression, because it is const and is initialized with constant expression 4;
int y = 3;
const int z = y;
std::bitset<z> qqq; //Error, z isn't a constant expression, because even though it is const it is initialized with a non-constant expression
Use std::vector<bool> or boost::dynamic_bitset instead for a dynamic (not known at compile time) size.
Upvotes: 5