Scott

Reputation: 152

Convert Vector of Binary Integers to Vector of Digits

So I'm having trouble with an assignment for class. The goal is to take a vector of length n filled with only binary digits (0 or 1), e.g. [1,1,0,1] where v[0] = 1, v[1] = 1, v[2] = 0, v[3] = 1. The function ItBin2Dec should output a vector of decimal digits representing the same integer, so [1,1,0,1] => [1,3] (for 13). I'm not a great programmer, so I tried to follow along with the algorithm given to us. Any advice would be greatly appreciated.

/* Algorithm we are given

function ItBin2Dec(v)
Input: An n-bit integer v >= 0 (binary digits)
Output: The vector w of decimal digits of v

Initialize w as empty vector
if v=0: return w
if v=1: w=push(w,1); return w
for i=size(v) - 2 downto 0:
  w=By2inDec(w)
  if v[i] is odd: w[0] = w[0] + 1
return w

*/
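
/* Here's my attempt at tracing the algorithm (my own example, not from
   the assignment), assuming the loop "downto 0" means the LEAST
   significant bit sits at index 0, and that w is implicitly seeded
   with the top bit before the loop:

   v = [1,0,1,1]  (1101 in binary = 13, LSB first)

   start: w = [1]                            (top bit v[3])
   i = 2: w = [2];   v[2] = 1 -> w = [3]     (binary   11 =  3)
   i = 1: w = [6];   v[1] = 0 -> w = [6]     (binary  110 =  6)
   i = 0: w = [2,1]; v[0] = 1 -> w = [3,1]   (binary 1101 = 13)

   w ends up with the ones digit first, which is why main() prints it
   in reverse. */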

#include <vector>
#include <iostream>

using namespace std;

vector<int> By2InDec(vector<int> y); // forward declaration (defined below)

vector<int> ItBin2Dec(vector<int> v) {
    vector<int> w; // initialize vector w
    if (v.size() == 0) { // if empty vector, return w
        return w;
    }
    if (v.size() == 1) { // single bit: w is just that bit
        w.push_back(v[0]);
        return w;
    }
    else { // more than one bit
        w.push_back(v[v.size() - 1]); // seed w with the top bit; the pseudocode leaves this step implicit
        for (int i = (int)v.size() - 2; i >= 0; i--) {
            w = By2InDec(w); // doubles the decimal number held in w
            if (v[i] == 1) { // bit set: add 1 (w[0] is even after doubling, so no carry is possible)
                w[0] = w[0] + 1;
            }
        }
    }
    return w;
}

vector<int> By2InDec(vector<int> y) {
    // Doubles the decimal number stored digit-by-digit in y (y[0] = ones
    // digit), carrying between digits like longhand multiplication by 2.
    vector<int> z;
    int carry = 0;
    for (size_t i = 0; i < y.size(); i++) {
        int d = 2 * y[i] + carry;
        z.push_back(d % 10); // keep the low digit
        carry = d / 10;      // carry into the next digit
    }
    if (carry > 0)
        z.push_back(carry); // the number gained a digit
    return z;
}

int main() {
    vector<int> binVect; // init binary vect
    vector<int> decVect; // init decimal vect

    decVect = ItBin2Dec(binVect); // calls ItBin2Dec and converts bin vect to dec vect
    for (int i = (int)decVect.size() - 1; i >= 0; i--) { // print the most significant digit first (w[0] is the ones digit)
        cout << decVect[i] << " ";
    }
    cout << endl;

    return 0;
}

It's been a while since I've had to code anything, so I'm a bit rusty. Obviously I haven't set it up with actual inputs yet; I'm just trying to get the skeleton first. The actual assignment asks for multiplication of vectors of binary digits, outputting the resulting vector of decimal digits, but I figured I would start with this and work from there. Thanks!

Upvotes: 0

Views: 3067

Answers (1)

Manu343726

Reputation: 14174

Converting a number from binary to decimal is easy, so I suggest you first compute the decimal number, and then get the digits of that number:

#include <vector>

int binary_to_decimal(const std::vector<int>& bits)
{
    int result = 0;
    int base = 1;

    //Supposing the MSB is at the beginning of the bits vector.
    //The index must be signed: with an unsigned i, "i >= 0" is always
    //true and the loop never terminates.
    for(int i = (int)bits.size() - 1 ; i >= 0 ; --i)
    {
        result += bits[i]*base;
        base *= 2;
    }

    return result;
}

std::vector<int> get_decimal_digits(int number)
{
    std::vector<int> digits;

    //Extract digits from least to most significant; inserting at the
    //front leaves the most significant digit first. Note that
    //pre-sizing the vector (e.g. via std::log10) would leave stray
    //zeros behind, since insert() grows the vector on its own:
    do
    {
        digits.insert( digits.begin() , number % 10);
        number /= 10;
    } while(number > 0);

    return digits;
}


std::vector<int> binary_digits_to_decimal_digits(const std::vector<int>& bits)
{
    return get_decimal_digits(binary_to_decimal(bits));
}
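
As a quick sanity check (a minimal driver of my own, assuming the three functions above are in the same file), this wires everything together on the example from your question:

#include <iostream>

int main()
{
    std::vector<int> bits = {1, 1, 0, 1}; // 1101 in binary, MSB first: 13

    for (int digit : binary_digits_to_decimal_digits(bits))
        std::cout << digit;               // prints 13

    std::cout << std::endl;
    return 0;
}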

Upvotes: 1
