Reputation: 920
This is pseudocode for a function that turns decimal digits into a binary representation.
The question is: show that Ldiv2[A] for an n-digit number is O(n), and determine the running complexity of the whole algorithm.
The input is the decimal representation of a number X, given by an array of digits A[n-1], …, A[0].
The algorithm uses a "long division by two" procedure Ldiv2 that divides a decimal number by 2. The binary conversion algorithm below converts the array of decimal digits A[0..n-1] to the array of bits B[0..4n-1] as follows:
Initialize B[0..4n-1], an array of bits
For i = 0 to 4n-1 do:
Begin
    B[i] = A[0] % 2;  // % is the mod operator
    A = Ldiv2[A];
End;
Return B (possibly removing leading 0's)
So for the example X = 169 we have n = 3: B[0] = A[0] % 2 = 9 % 2 = 1, then A = Ldiv2[A] = 84, B[1] = A[0] % 2 = 4 % 2 = 0, etc.
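The pseudocode above can be made concrete. Here is a minimal Python sketch; the helper name ldiv2 and the least-significant-digit-first layout are assumptions consistent with the worked example, where A[0] = 9 for X = 169:

```python
# Minimal sketch of the algorithm, assuming (as in the example) that
# A[0] is the LEAST significant decimal digit, so X = 169 is [9, 6, 1].
def ldiv2(A):
    """Long division by 2: one pass over the digits, so O(n) for n digits."""
    result = [0] * len(A)
    carry = 0
    # long division works from the most significant digit down
    for i in range(len(A) - 1, -1, -1):
        cur = carry * 10 + A[i]
        result[i] = cur // 2
        carry = cur % 2          # remainder carried into the next digit
    return result

def to_binary(A):
    """Convert decimal digits A[0..n-1] to bits B[0..4n-1], per the pseudocode."""
    n = len(A)
    B = [0] * (4 * n)
    for i in range(4 * n):
        B[i] = A[0] % 2          # current least significant bit of the value
        A = ldiv2(A)
    return B                     # B[0] is the least significant bit

bits = to_binary([9, 6, 1])      # X = 169
# reading B most-significant-bit first gives 169 = 10101001 in binary
```

Note that ldiv2 makes exactly one pass over the n digits, which is the O(n) bound the question asks about.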
For Ldiv2[A] I put 4n-1 for n > 1, so by definition it should be O(n), and for the running complexity of the algorithm I also put O(n), because it only has one for loop running from 0 to 4n-1 — although I'm a bit unclear whether there is a proof of that.
Upvotes: 2
Views: 954
Reputation: 53535
We run the loop 4n-1 times, and each iteration performs a long-division pass whose cost is proportional to the number of decimal digits A currently has: O(n) at the beginning, shrinking to O(1) at the end (once A is down to a single digit). Since A halves every iteration, it loses one decimal digit roughly every log2(10) ≈ 3.3 iterations, so the per-iteration costs form an arithmetic series:

n + n + n + (n-1) + (n-1) + (n-1) + ... + 1 = Θ(n^2)

So a single call to Ldiv2 on an n-digit number is O(n) (one pass over the digits), but the conversion as a whole is O(n^2), not O(n).
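One way to check the total cost empirically is to count one unit of work per digit that each long-division pass touches (a hypothetical instrumented sketch; trimming leading zeros matches the assumption that the cost drops to O(1) once A has shrunk):

```python
# Hypothetical instrumented sketch: count one unit of work per digit that
# the long-division pass touches. Leading zeros are trimmed, matching the
# assumption that a pass costs O(1) once A has shrunk to one digit.
def total_digit_ops(n):
    A = [9] * n                  # worst case: the largest n-digit number
    ops = 0
    for _ in range(4 * n):
        carry = 0
        for i in range(len(A) - 1, -1, -1):   # one Ldiv2 pass, in place
            cur = carry * 10 + A[i]
            A[i] = cur // 2
            carry = cur % 2
            ops += 1
        while len(A) > 1 and A[-1] == 0:      # drop leading zeros
            A.pop()
    return ops

# Doubling n roughly quadruples the work, as a quadratic bound predicts.
ratio = total_digit_ops(200) / total_digit_ops(100)
```

Doubling n roughly quadruples the operation count, which is consistent with a Θ(n²) total.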
Upvotes: 1