Reputation: 577
Is there any way to distinguish between sorting algorithms just from their executable files? I found this problem on a varsity programming mailing list; it goes like this: Say I have a number of executable files that sort an array of data using different algorithms. I know which algorithms were used to build those executables, but I don't know which algorithm was used in which executable file. The algorithms used are:
Upvotes: 3
Views: 1524
Reputation: 11
Use the following commands in CMD; they give you the processing time for each executable, which lets you order them:

echo %time%
filename.exe
echo %time%
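If you want to script that timing instead, here is a minimal Python sketch along the same lines. It assumes each executable reads one integer per line from standard input, and the sort_*.exe names are placeholders:

import random
import subprocess
import time

# Placeholder names; substitute the real executables.
EXECUTABLES = ["sort_a.exe", "sort_b.exe", "sort_c.exe"]

def make_input(n):
    # Assumed input format: one integer per line on stdin.
    return ("\n".join(str(random.randint(0, 10**9)) for _ in range(n)) + "\n").encode()

payload = make_input(1_000_000)

for exe in EXECUTABLES:
    start = time.perf_counter()
    subprocess.run([exe], input=payload, stdout=subprocess.DEVNULL)
    print(f"{exe}: {time.perf_counter() - start:.3f} s")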
Upvotes: 1
Reputation: 6486
Change the kinds of data and the amount of data you input and compare execution times.
Changing the nature of the data (many repeated small numbers with only a few digits vs. widely distributed data with no duplicates) helps you determine whether a sorting algorithm is comparison-based (radix/bucket sort vs. comparison-based sorts). For example, sorting 1,000,000 one-digit numbers is very fast with a bucket sort, since its running time depends mainly on the number of digits, but slower for comparison-based sorts, which scale mainly with the size of the data set.
You could also tailor the data to favor some algorithms over others: feed each executable the best-case and worst-case scenarios for the various algorithms and look for the .exe with the most dramatic change in execution time.
For example, to distinguish between insertion sort and selection sort, use an almost sorted data set like (2, 3, ..., 98, 99, 1). Insertion sort will make one cheap pass over the already-sorted prefix and a single insert-shift to move the 1 to the front, so it takes almost no time. Selection sort will still scan the entire unsorted remainder at every index (and will swap at every index, since the minimum always sits at the final position), so it will take much longer.
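A rough sketch of that kind of test harness in Python, assuming each executable reads one integer per line from standard input; the executable names are placeholders and the generators mirror the cases described above:

import random
import subprocess
import time

EXECUTABLES = ["sort_a.exe", "sort_b.exe"]  # placeholders
N = 1_000_000

def few_digit_duplicates(n):
    # Many repeats of one-digit values: favours radix/bucket-style sorts.
    return [random.randint(0, 9) for _ in range(n)]

def widely_distributed(n):
    # Large range, no duplicates.
    return random.sample(range(n * 100), n)

def almost_sorted(n):
    # 2, 3, ..., n, 1 -- cheap for insertion sort, slow for selection sort.
    return list(range(2, n + 1)) + [1]

def time_exe(exe, data):
    payload = ("\n".join(map(str, data)) + "\n").encode()
    start = time.perf_counter()
    subprocess.run([exe], input=payload, stdout=subprocess.DEVNULL)
    return time.perf_counter() - start

for name, gen in [("few-digit dups", few_digit_duplicates),
                  ("wide range", widely_distributed),
                  ("almost sorted", almost_sorted)]:
    data = gen(N)
    for exe in EXECUTABLES:
        print(f"{name:15s} {exe}: {time_exe(exe, data):.3f} s")

The executable whose timing changes most dramatically between the "almost sorted" case and the others is the best candidate for insertion sort.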
Upvotes: 1
Reputation: 420951
You can check their asymptotic behavior by giving them larger and larger inputs, but many of the listed algorithms fall in the same complexity class, so you wouldn't be able to distinguish between, say, merge sort and quick sort on that basis alone.
To break some of these degeneracies you could also look at the memory usage of the different executables. To continue with the merge sort and quick sort example, you would see that merge sort requires O(n) additional space, while quick sort only needs O(log n) additional space (stack size) to perform the sort.
You might also be able to deduce something from giving them degenerate input, such as a megabyte of zeros or a megabyte of reversed strings. But you wouldn't be able to do more than make educated guesses.
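Here is a rough Python sketch of that measurement loop, assuming the executables read one integer per line from standard input and that the third-party psutil package is available for sampling the child process's memory; the executable names are placeholders:

import random
import subprocess
import time

import psutil  # third-party; assumed to be installed

EXECUTABLES = ["sort_a.exe", "sort_b.exe"]  # placeholder names

def run_once(exe, data):
    """Return (wall time in seconds, peak RSS in bytes) for one run."""
    payload = ("\n".join(map(str, data)) + "\n").encode()
    start = time.perf_counter()
    proc = subprocess.Popen([exe], stdin=subprocess.PIPE,
                            stdout=subprocess.DEVNULL)
    info = psutil.Process(proc.pid)
    peak = 0
    proc.stdin.write(payload)
    proc.stdin.close()
    while proc.poll() is None:          # sample memory while the child runs
        try:
            peak = max(peak, info.memory_info().rss)
        except psutil.NoSuchProcess:
            break
        time.sleep(0.01)
    return time.perf_counter() - start, peak

# Double the input size and watch how time and memory grow:
# time that roughly doubles per doubling is consistent with O(n log n),
# quadrupling points at O(n^2); extra memory that grows linearly with n
# points at merge sort rather than quick sort.
for n in (250_000, 500_000, 1_000_000, 2_000_000):
    data = [random.randint(0, 10**9) for _ in range(n)]
    for exe in EXECUTABLES:
        t, mem = run_once(exe, data)
        print(f"n={n:>9,} {exe}: {t:6.2f} s, peak RSS {mem / 2**20:6.1f} MiB")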
Upvotes: 3