Reputation: 139
I have three integers (x1, x2, x3), all in [0, 255]. I need to encode them into two integers (a and b) such that I can deterministically decode them back. The constraint is that the new integers need to be small.
So I can do a = 256*x1 + x2, but this makes a much larger than the xi.

Is there any way to encode the integers such that the resulting numbers stay small? I am not defining what "small" means, because I want them as small as possible.
A similar problem is to encode these 3 numbers into just 1. Again, the new integer needs to be as small as possible. Any way to do this?
Upvotes: 0
Views: 321
Reputation: 46497
Welcome to information theory / the pigeonhole principle. If you wish to encode x different values, you need enough bits to distinguish between x different things. In your case there are 256 = 2**8 possibilities (using ** for exponentiation) for each of x1, x2, and x3. Therefore in total there are 2**24 possibilities for the combination, so you need 24 bits.
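The counting argument can be checked directly in Python (this snippet just illustrates the arithmetic above, nothing more):

```python
# Three independent values, 256 possibilities each.
combinations = 256 ** 3
assert combinations == 2 ** 24 == 16777216

# 24 bits are exactly enough to index every combination.
assert (combinations - 1).bit_length() == 24
```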
Your first encoding can be achieved with two 12-bit numbers in the range 0-4095. The encoding can be done as follows (where % is the remainder operation and // is integer division, as they are in Python 3):
a = (x1%16) * 256 + x2
b = (x1//16) * 256 + x3
with a decoding of:
x1 = (a//256) + (b//256) * 16
x2 = a%256
x3 = b%256
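As a sanity check, the two-number scheme above can be wrapped in small functions and round-tripped (the names encode2/decode2 are my own, not from any library):

```python
import random

def encode2(x1, x2, x3):
    # Split x1's 8 bits: its low nibble rides along with x2,
    # its high nibble rides along with x3. Each result fits in 12 bits.
    a = (x1 % 16) * 256 + x2
    b = (x1 // 16) * 256 + x3
    return a, b

def decode2(a, b):
    # Reassemble x1 from the two nibbles; x2 and x3 are the low bytes.
    x1 = (a // 256) + (b // 256) * 16
    x2 = a % 256
    x3 = b % 256
    return x1, x2, x3

# Spot-check the corners plus some random triples.
for x1, x2, x3 in [(0, 0, 0), (255, 255, 255)] + \
        [tuple(random.randrange(256) for _ in range(3)) for _ in range(1000)]:
    a, b = encode2(x1, x2, x3)
    assert 0 <= a <= 4095 and 0 <= b <= 4095
    assert decode2(a, b) == (x1, x2, x3)
```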
Encoding into 1 number again needs 2**24 possibilities, so that number needs to be in the range 0..16777215. And the encoding this time is:
c = x1 + 256*x2 + 65536*x3
with a decoding of:
x1 = c%256
x2 = (c//256)%256
x3 = c//65536
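The single-number scheme is just treating (x3, x2, x1) as the digits of a base-256 number; a round-trip sketch (function names encode1/decode1 are my own):

```python
def encode1(x1, x2, x3):
    # x1 is the low byte, x2 the middle byte, x3 the high byte.
    return x1 + 256 * x2 + 65536 * x3

def decode1(c):
    # Peel the three base-256 digits back off.
    x1 = c % 256
    x2 = (c // 256) % 256
    x3 = c // 65536
    return x1, x2, x3

assert encode1(255, 255, 255) == 16777215   # the top of the 24-bit range
assert decode1(encode1(1, 2, 3)) == (1, 2, 3)
```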
There are various other encodings/decodings you could choose, but none of them can get by with smaller ranges of numbers than these.
Upvotes: 2