Reputation: 938
I want to get the indices that would sort the array.
Look at the code below:
var distances = [530, 0, 24, 335, 274, 591];

console.log(sortIndex(distances));

function sortIndex(toSort) {
  return toSort.map(
    (e, i) => { return { index: i, value: e }; }
  ).sort(
    (a, b) => a.value - b.value
  ).map(
    (e, i) => { return e.index; }
  );
}
The output is:
[1, 2, 4, 3, 0, 5]
Can someone tell me how to reduce the complexity of this snippet? This algorithm is O(n^3), and I wonder whether the same problem can be solved with a better approach. Anyone out there? Thanks.
Upvotes: 2
Views: 312
Reputation: 24211
I am actually thinking of solving the problem with a TreeMap, which keeps its keys in ascending order and can store the indexes as the value for each key. That way duplicate values are indexed properly as well, and I suppose solving the problem with a TreeMap should have O(n log n) complexity.
Here is my working code in Java.
import java.util.*;

public class HelloWorld {

    public static void main(String[] args) {
        int[] arr = {530, 0, 24, 335, 274, 591};
        int[] sortedIndexes = getSortedIndexes(arr);
        printSortedIndexes(sortedIndexes);
    }

    public static void printSortedIndexes(int[] printArr) {
        for (int element : printArr) {
            System.out.print(element + " ");
        }
    }

    public static int[] getSortedIndexes(int[] arr) {
        TreeMap<Integer, List<Integer>> map = new TreeMap<>();
        for (int i = 0; i < arr.length; i++) {
            if (map.containsKey(arr[i])) {
                List<Integer> indexes = map.get(arr[i]);
                indexes.add(i);
            } else {
                List<Integer> indexes = new ArrayList<>();
                indexes.add(i);
                map.put(arr[i], indexes);
            }
        }
        // Now read the indexes back out of the TreeMap in ascending key order
        int[] result = new int[arr.length];
        int counter = 0;
        for (Map.Entry<Integer, List<Integer>> entry : map.entrySet()) {
            List<Integer> indexes = entry.getValue();
            for (int index : indexes) {
                result[counter] = index;
                counter++;
            }
        }
        return result;
    }
}
The TreeMap itself has O(log n) complexity for insertion and lookup. I am using a map of <Integer, List<Integer>> so that we can store the values from the given array as keys and the indexes of those values as a List under each key (this is to handle duplicates). If there are no duplicates in the input, we can get rid of the List<Integer> and just have a TreeMap<Integer, Integer>.
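For illustration, here is a minimal sketch of that simplified variant (the class name SortedIndexesNoDuplicates is just for this example, and it assumes every value in the input appears exactly once):

import java.util.Arrays;
import java.util.Map;
import java.util.TreeMap;

public class SortedIndexesNoDuplicates {

    public static int[] getSortedIndexes(int[] arr) {
        // Keys (array values) are kept in ascending order by the TreeMap.
        TreeMap<Integer, Integer> map = new TreeMap<>();
        for (int i = 0; i < arr.length; i++) {
            map.put(arr[i], i); // value -> index of that value
        }

        // Walking the entry set visits the keys in ascending order,
        // so the stored indexes come out in sorted-value order.
        int[] result = new int[arr.length];
        int counter = 0;
        for (Map.Entry<Integer, Integer> entry : map.entrySet()) {
            result[counter++] = entry.getValue();
        }
        return result;
    }

    public static void main(String[] args) {
        int[] arr = {530, 0, 24, 335, 274, 591};
        System.out.println(Arrays.toString(getSortedIndexes(arr)));
        // prints [1, 2, 4, 3, 0, 5]
    }
}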
I hope that helps.
Upvotes: 1
Reputation: 48357
You could obtain better performance by creating a copy of the array and a hash structure of indexes; building that hash is O(n) because we're iterating the array only once.
var distances = [530, 0, 24, 335, 274, 591];
var copy = Object.assign([], distances);

// Build a value -> index lookup in a single pass: O(n)
var indexes = distances.reduce((hash, elem, i) => {
  hash[elem] = i;
  return hash;
}, {});

// Sort numerically (the default sort compares values as strings), then look up each value's index
sortIndex = (toSort) => toSort.sort((a, b) => a - b).map(el => indexes[el]);

console.log(sortIndex(copy));
So, the final complexity of the algorithm is the complexity of sort plus the complexity of the map function. For sort the complexity is Θ(n log n). For the map method the complexity is O(n), because you're iterating the array only once to map every item through the provided callback function.
Upvotes: 1