I have a problem that requires transforming one string into another by appending copies of its initial value to itself. The problem also allows removing single characters at some places.
Explanation
let x = "abba"; // First string
let y = "aba";  // Second (initial) string
y("aba") => remove last "a" => y("ab") => y + initialY = "ab" + "aba" =>
y("ababa") => remove char at index 2 => y("abba") => y == x => success
My algorithm successfully solves the problem:
let x = "abbbbcccaaac"
let y = "abc"

let xArr = x.split('')
let yArr = y.split('')
let count = 0;

for (let i = 0; i < xArr.length; i++) {
  if (yArr[i] == undefined) {
    yArr = yArr.concat(y.split(''));
    count++;
  }
  if (xArr[i] != yArr[i]) {
    yArr.splice(i, 1);
    i--;
  }
}

console.log("Output is:", yArr.join(''))
console.log("Appends in order to transform:", count)
The algorithm works as intended; however, I am uncertain about its time and space complexity and, most importantly, its efficiency.
Is this algorithm O(n) in time complexity, where n is the length of x? If it is not O(n), can the problem be solved in O(n) time?
Do .concat(), .splice() or .split() change the time complexity, given that they are nested in a for loop? If they weren't nested, would they still affect the time complexity of an algorithm, and by how much?
Given the rules of this problem, is this an efficient way to solve it?
What is the space complexity of this algorithm?
Upvotes: 8
Views: 2693
Reputation: 51034
Normally a question like this is quite difficult to give a definite answer to, because different implementations of JavaScript have different time complexities for basic array operations (such as creating a new array of size n). JavaScript arrays will typically be implemented either as dynamic arrays or as hashtables, and these two data structures have different performance characteristics.
So, there is no definitive time complexity for splice to remove one element from an array. What we can say is that removing one element takes linear time for a dynamic array and, as @Ry- points out in the comments, also linear time for a hashtable, because of the need to renumber the later indices. We can also say that it's highly likely one of these two data structures is used, and no sensible implementation will take more than linear time to do splice.
Either way, the worst case for your algorithm is when x = 'aa...aa' and y = 'abb...bb', i.e. x is n copies of 'a', and y is 'a' followed by (m - 1) copies of 'b'.
For a dynamic array or a hashtable, the time complexity of just the splice operations is O(nm²). This is because the outer loop iterates O(nm) times (note the i-- inside the loop, which happens every time a letter 'b' needs to be removed), and each splice operation requires shifting or renumbering the O(m) elements of yArr after index i.
But suppose some more exotic data structure is used which supports removing an element in sub-linear time (e.g. a skip list). In that case, the above only gives O(nm) times the complexity of the "remove" operation. But we haven't counted concat yet; it creates a new data structure and copies every item into it, which still takes linear time. concat is called O(n) times and takes an average of O(n + m) time per call, so the complexity of just the concat operations is O(n² + nm).
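These two costs can be checked empirically. Below is a small instrumented sketch (the helper name measureWork and the counters are mine, not part of the question's code) that replays the question's loop on the worst-case input and counts how many elements splice has to shift and concat has to copy:

```javascript
// Replay the question's loop on x = 'a'.repeat(n), y = 'a' + 'b'.repeat(m - 1),
// counting the elements touched by splice and concat.
function measureWork(n, m) {
  const xArr = 'a'.repeat(n).split('');
  const y = 'a' + 'b'.repeat(m - 1);
  let yArr = y.split('');
  let shifted = 0; // elements moved left (renumbered) by splice calls
  let copied = 0;  // elements copied into new arrays by concat calls
  for (let i = 0; i < xArr.length; i++) {
    if (yArr[i] == undefined) {
      copied += yArr.length + m; // concat copies both operands
      yArr = yArr.concat(y.split(''));
    }
    if (xArr[i] != yArr[i]) {
      shifted += yArr.length - i - 1; // tail entries after index i
      yArr.splice(i, 1);
      i--;
    }
  }
  return { shifted, copied };
}
```

On this input, doubling n roughly doubles shifted while doubling m roughly quadruples it, and copied grows faster than linearly in n — consistent with the O(nm²) and O(n² + nm) estimates above.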
So the time complexity is very likely O(n² + nm²), and certainly at least O(n² + nm); not linear.
The space complexity is O(n), since yArr is never more than twice as long as xArr.
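For comparison, the append count can be computed without materialising yArr at all: the transformation succeeds exactly when x is a subsequence of y repeated enough times, so a single forward scan with a cursor into the cyclic repetition of y suffices. A sketch (countAppends is a name I've made up, and the subsequence reading of the problem is my assumption):

```javascript
// Count how many copies of y must be appended so that x can then be obtained
// by deletions alone, i.e. so that x is a subsequence of y repeated.
// Returns -1 if x contains a character that never occurs in y.
function countAppends(x, y) {
  for (const c of x) {
    if (!y.includes(c)) return -1; // c can never be matched
  }
  if (x.length === 0) return 0;
  let j = 0; // cursor into the (conceptually) infinite repetition of y
  for (const c of x) {
    while (y[j % y.length] !== c) j++; // skipping a character = deleting it
    j++; // consume the matched character
  }
  return Math.floor((j - 1) / y.length); // full wraps past the initial copy
}

console.log(countAppends("abba", "aba")); // 1, matching the worked example
```

This runs in O(nm) time in the worst case and O(1) extra space beyond the inputs — still not linear, but well below the bounds above; with a precomputed next-occurrence table per character it can be brought down further.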
Upvotes: 3