Reputation: 9
I'd like to ask a question that I tried to answer myself but couldn't come up with any solution.
I'd like to know of an algorithm (or, if that's not possible, at least a proof of whether one does or doesn't exist) with these properties:
+-----------+
status_in --> | ALGORITHM | --> status_out
+-----------+
Sorry in advance if the question is not well formed and maybe lacks some important details, but those are basically the only two properties that I'm interested in and I'm not able to rephrase the question more precisely.
Thank you in advance for any help, and please let me know if there are any more details that I could give to make my question more clear.
Upvotes: 0
Views: 82
Reputation: 112284
I will ignore the use of the word "random" in your question and assume that you are asking about a deterministic algorithm. Therefore when you say 50%, I will take that to mean that exactly half of the possible inputs are coded one bit shorter, and exactly half one bit longer. I am also assuming that you know by some means the lengths in bits of the input and output messages, so that codes can share common prefixes. That is, the output string of bits does not have to be self-terminating. I will also assume that you do not need to code a zero-bit-length input; there is at least one bit of input. However, a zero-bit output is allowed.
The answer is yes if status_in is a single fixed number of bits. The answer is no if status_in can be any number of bits.

So if status_in is k bits, then half of the inputs can be encoded in k-1 bits, using all of the possible (k-1)-bit strings. The other half can be encoded in one-fourth of the available (k+1)-bit strings. This is reversible.
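Under those assumptions, the fixed-length scheme can be sketched as follows (a minimal illustration of my own; the identity mapping and the encode/decode names are choices for this sketch, not from the answer, and the decoder is assumed to know the lengths involved, so codes need not be self-terminating):

```python
def encode(v, k):
    # Reversibly encode a k-bit value v: the lower half of the values
    # shrink to k-1 bits, the upper half grow to k+1 bits.
    assert 0 <= v < 1 << k
    if v < 1 << (k - 1):
        return v, k - 1   # lower half: uses all 2^(k-1) strings of length k-1
    return v, k + 1       # upper half: uses 2^(k-1) of the 2^(k+1) strings (one-fourth)

def decode(code, length, k):
    # The known length tells us which half the value came from; the value
    # itself is carried through unchanged, so decoding is the identity.
    if length == k - 1:
        assert code < 1 << (k - 1)
    else:
        assert length == k + 1 and 1 << (k - 1) <= code < 1 << k
    return code
```

For k = 4 this maps 8 of the 16 inputs to 3-bit codes and the other 8 to 5-bit codes, and every (code, length) pair is distinct, so the round trip recovers the input exactly.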
However, if status_in can be any number of bits, we run into a conflict between the mappings of k-bit and (k+2)-bit inputs. To encode k bits, we use one-fourth of the (k+1)-bit values. However, to encode k+2 bits, we use all of the (k+1)-bit values. So there are not enough (k+1)-bit values. This is first seen when coding 1 bit and 3 bits. One of the 1-bit values codes to zero bits, and the other codes to 2 bits. However, when coding 3 bits, four of the 3-bit values code to all of the possible 2-bit values, but we already used one of those for a 1-bit value. So it can't be done.
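The counting conflict above can be checked mechanically. This sketch (my own, assuming inputs of every length k >= 1) tallies how many n-bit output strings are demanded against how many exist:

```python
def demanded(n):
    # Outputs of length n are claimed by half of the (n+1)-bit inputs
    # (each shortened by one bit) ...
    d = 2 ** (n + 1) // 2
    # ... and by half of the (n-1)-bit inputs (each lengthened by one bit),
    # provided inputs of that length exist (inputs are at least 1 bit long).
    if n >= 2:
        d += 2 ** (n - 1) // 2
    return d

for n in range(2, 6):
    # demanded(n) exceeds the 2**n available strings for every n >= 2
    print(n, demanded(n), 2 ** n)
```

For n = 2 this gives 5 demanded codes against 4 available, which is exactly the 1-bit versus 3-bit conflict described above, and the deficit only grows with n.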
Upvotes: 0
Reputation: 49311
Going from left to right you are erasing a bit of information half the time, so by Landauer's principle the left-to-right process causes an entropy increase in the system's environment.
For the process to be reversible, recovering 'status_in' from 'status_out', you need to have a means of pulling entropy from the environment, i.e. a perpetual motion machine.
Upvotes: 0
Reputation: 178421
If all possibilities of status_in are used (if there are k bits in status_in, then it can have 2^k different values, each different from the others), then no: status_out has k-1 bits, and thus the maximum number of values possible for status_out is 2^(k-1). Since 2^k > 2^(k-1), there are some x, y (values of status_in) such that f(x) = f(y). However, given f(x), you cannot tell which was the original: x or y.

If status_in does not include all possibilities, then yes. Take for example a 32-bit signed integer (int in most languages) that, due to some other restriction, has to be positive. You can remove the sign bit (which is always 0) and get a 31-bit number. Since you know the source is always positive, adding the 0 back is easy.

Upvotes: 1
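The sign-bit example from the last answer can be sketched like this (a minimal illustration; the function names are my own, and the known restriction is that the value is non-negative):

```python
def drop_sign_bit(v):
    # Restriction from the answer: v is a 32-bit int known to be
    # non-negative, so its sign bit is always 0 and can be dropped reversibly.
    assert 0 <= v < 1 << 31
    return v & 0x7FFFFFFF   # the 31 remaining bits

def restore_sign_bit(v31):
    # The dropped bit is known to be 0, so nothing needs to be added back.
    return v31
```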