Flave

Reputation: 77

Search binary number

I have defined some values, see below, and I can't use them properly.

#define add 000001
#define sub 000010
#define jmp 000111 

#define IMM 10000
#define ADDR 10001

In my code, I set an address in hex:

    parameter1 = false;
    parameter2 = false;

    uint64_t data = 0xffffffff05001e00;

    uint16_t vector[4];
    memcpy(vector, &data, sizeof(uint64_t)); // on a little-endian machine vector[0] == 0x1e00

    int currentPosition = 0;

    while (currentPosition < 4) {

        header = vector[currentPosition]; // 16-bit instruction header

        opcode = header >> 0xA & 0x3F; // top 6 bits
        src1 = header >> 0x5 & 0x1F;   // middle 5 bits
        src2 = header & 0x1F;          // bottom 5 bits

        if (src1 == ADDR || src1 == IMM) { parameter1 = true; }
        if (src2 == ADDR || src2 == IMM) { parameter2 = true; }
        ....
        currentPosition++;
    }

header = 0x1e00 in this case (because it's vector[0])

From there it computes: opcode = 0x7, src1 = 0x10, src2 = 0x0.

That means in binary: 000111 10000 00000 -> jmp IMM NULL
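
For reference, here is a standalone sketch of the decoding step (the main wrapper and the printout are my additions for illustration; it assumes a little-endian machine so that vector[0] is 0x1e00):

    #include <cstdint>
    #include <cstdio>
    #include <cstring>

    int main() {
        uint64_t data = 0xffffffff05001e00;

        uint16_t vector[4];
        std::memcpy(vector, &data, sizeof data); // little-endian: vector[0] == 0x1e00

        uint16_t header = vector[0];
        unsigned opcode = header >> 10 & 0x3F; // 0x1e00 >> 10 == 0b000111 == 0x7
        unsigned src1   = header >> 5  & 0x1F; // 0b10000 == 0x10 == 16
        unsigned src2   = header       & 0x1F; // 0b00000 == 0x0

        std::printf("opcode=0x%X src1=0x%X src2=0x%X\n", opcode, src1, src2);
        // prints: opcode=0x7 src1=0x10 src2=0x0
    }

So src1 comes out as the number 16 (0x10), and that is the value the IMM define has to equal for the comparison to succeed.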

When the first if is evaluated, parameter1 should become true, but this never happens. Why is that? Have I not defined the IMM value correctly?

Thanks!!!

Upvotes: 1

Views: 75

Answers (2)

Flave

Reputation: 77

To get the values I actually intended, I just rewrote the defines like this:

  #define add  0x1   // binary 000001
  #define sub  0x2   // binary 000010
  #define jmp  0x7   // binary 000111

  #define IMM  0x10  // binary 10000
  #define ADDR 0x11  // binary 10001

And now it's working fine.
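
A quick compile-time check (these static_asserts are my addition; the 0b literals need C++14) confirms the hex values match the bit patterns I wanted:

  static_assert(add  == 0b000001, "add is binary 000001");
  static_assert(jmp  == 0b000111, "jmp is binary 000111");
  static_assert(IMM  == 0b10000,  "IMM is binary 10000");
  static_assert(ADDR == 0b10001,  "ADDR is binary 10001");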

Upvotes: 0

463035818_is_not_an_ai

Reputation: 122228

None of your defined numbers are binary numbers:

#define add 000001      // octal literals, because they start with 0
#define sub 000010
#define jmp 000111

#define IMM 10000       // decimal literals
#define ADDR 10001

None of them are binary representations as you assume in your code.

Since C++14 you can write a binary literal as (example from https://en.cppreference.com/w/cpp/language/integer_literal):

int b = 0b101010; // C++14

In general I would strongly advise you not to use #define, unless you deliberately choose to accept all the trouble that comes with using macros.
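
For example, your constants could be written as constexpr values with binary literals instead of macros (a sketch of one possible rewrite, assuming C++14):

#include <cstdint>

// the same values, as binary literals and typed constants instead of macros
constexpr std::uint16_t add  = 0b000001; // 0x01
constexpr std::uint16_t sub  = 0b000010; // 0x02
constexpr std::uint16_t jmp  = 0b000111; // 0x07

constexpr std::uint16_t IMM  = 0b10000;  // 0x10 == 16, not 10000
constexpr std::uint16_t ADDR = 0b10001;  // 0x11 == 17, not 10001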

Upvotes: 4
