Reputation: 85
In C++, is it possible to sort 1 million numbers, assuming we know the range of the numbers, using only 100,000 memory cells?
Specifically, a .bin file contains a million numbers within a known range, and I need to sort these numbers in descending order into another .bin file, but I am only allowed to use an array of size 100,000 for the sorting. Any ideas?
Upvotes: 0
Views: 1238
Reputation: 1184
I think I read this somewhere on SO or Quora, in a discussion of map-reduce:
Divide the 1 million numbers into 10 blocks. Read in the first block of 100k numbers, sort it with quicksort, and write it back to the original file. Do the same for the remaining 9 blocks. Then perform a 10-way merge of the 10 sorted blocks in the original file (you only need 10 cells for this) and write the merged output to another file. For faster writes, you can collect the output in a ~100k buffer and flush it to the output file in chunks.
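Here is a minimal sketch of that two-pass approach, assuming the input is 1,000,000 raw 32-bit ints in a file I call input.bin (both file names are placeholders) and sorting in the descending order the question asks for. Error handling and the ~100k output buffer mentioned above are omitted for brevity.

#include <algorithm>
#include <cstdint>
#include <fstream>
#include <functional>
#include <vector>

int main() {
    const std::size_t kTotal = 1'000'000;
    const std::size_t kBlock = 100'000;
    const std::size_t kBlocks = kTotal / kBlock;

    std::vector<int32_t> buf(kBlock);           // the single 100k-cell array (used in pass 1 only)

    // Pass 1: sort each 100k block in place (descending) and write it back.
    {
        std::fstream io("input.bin", std::ios::in | std::ios::out | std::ios::binary);
        for (std::size_t b = 0; b < kBlocks; ++b) {
            io.seekg(b * kBlock * sizeof(int32_t));
            io.read(reinterpret_cast<char*>(buf.data()), kBlock * sizeof(int32_t));
            std::sort(buf.begin(), buf.end(), std::greater<int32_t>());
            io.seekp(b * kBlock * sizeof(int32_t));
            io.write(reinterpret_cast<char*>(buf.data()), kBlock * sizeof(int32_t));
        }
    }

    // Pass 2: 10-way merge of the sorted blocks, streaming to the output file.
    std::ifstream in("input.bin", std::ios::binary);
    std::ofstream out("sorted.bin", std::ios::binary);
    std::vector<int32_t> head(kBlocks);         // current front value of each block
    std::vector<std::size_t> pos(kBlocks, 0);   // items consumed per block
    std::vector<bool> alive(kBlocks);           // block still has items left

    auto refill = [&](std::size_t b) -> bool {  // load the next value of block b, if any
        if (pos[b] >= kBlock) return false;
        in.seekg((b * kBlock + pos[b]) * sizeof(int32_t));
        in.read(reinterpret_cast<char*>(&head[b]), sizeof(int32_t));
        return true;
    };
    for (std::size_t b = 0; b < kBlocks; ++b) alive[b] = refill(b);

    for (std::size_t written = 0; written < kTotal; ++written) {
        std::size_t best = kBlocks;             // index of the largest front value
        for (std::size_t b = 0; b < kBlocks; ++b)
            if (alive[b] && (best == kBlocks || head[b] > head[best]))
                best = b;
        out.write(reinterpret_cast<char*>(&head[best]), sizeof(int32_t));
        ++pos[best];
        alive[best] = refill(best);
    }
}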
Upvotes: 3
Reputation: 20730
You can implement a version of the quick-sort algorithm that works on files instead of on vectors.
So, recursively split the file into a lower-than-pivot part and a higher-than-pivot part, sort those files, and recombine them. Once a part is small enough to fit in the available memory, switch to sorting it in memory instead of on files.
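A minimal sketch of that idea, assuming raw 32-bit ints in binary files; the function and file names here are my own placeholders, the first element is used as the pivot, and error handling is left out.

#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <fstream>
#include <functional>
#include <string>
#include <vector>

constexpr std::size_t kBufCells = 100'000;      // the allowed array size

static std::size_t count_ints(const std::string& path) {
    std::ifstream f(path, std::ios::binary | std::ios::ate);
    return static_cast<std::size_t>(f.tellg()) / sizeof(int32_t);
}

static void append_file(std::ofstream& out, const std::string& src) {
    std::ifstream in(src, std::ios::binary);
    char chunk[4096];
    while (in.read(chunk, sizeof chunk) || in.gcount() > 0)
        out.write(chunk, in.gcount());
}

// Sorts the 32-bit ints in `path` in descending order, in place.
static void sort_file(const std::string& path) {
    std::size_t n = count_ints(path);
    if (n <= kBufCells) {                       // small enough: sort in memory
        std::vector<int32_t> buf(n);
        std::ifstream(path, std::ios::binary)
            .read(reinterpret_cast<char*>(buf.data()), n * sizeof(int32_t));
        std::sort(buf.begin(), buf.end(), std::greater<int32_t>());
        std::ofstream(path, std::ios::binary | std::ios::trunc)
            .write(reinterpret_cast<const char*>(buf.data()), n * sizeof(int32_t));
        return;
    }

    // Partition around the first element into higher / equal / lower temp files.
    std::string hi = path + ".hi", eq = path + ".eq", lo = path + ".lo";
    {
        std::ifstream in(path, std::ios::binary);
        std::ofstream fhi(hi, std::ios::binary), feq(eq, std::ios::binary),
                      flo(lo, std::ios::binary);
        int32_t pivot, x;
        in.read(reinterpret_cast<char*>(&pivot), sizeof pivot);
        feq.write(reinterpret_cast<char*>(&pivot), sizeof pivot);
        while (in.read(reinterpret_cast<char*>(&x), sizeof x)) {
            std::ofstream& dst = (x > pivot) ? fhi : (x < pivot) ? flo : feq;
            dst.write(reinterpret_cast<char*>(&x), sizeof x);
        }
    }
    sort_file(hi);                              // recurse on both sides
    sort_file(lo);

    // Recombine: descending means the higher part first, then equals, then lower.
    std::ofstream out(path, std::ios::binary | std::ios::trunc);
    append_file(out, hi);
    append_file(out, eq);
    append_file(out, lo);
    std::remove(hi.c_str()); std::remove(eq.c_str()); std::remove(lo.c_str());
}

Calling sort_file("input.bin") sorts the file in place; the result can then be copied or renamed to the target .bin file.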
Upvotes: 1
Reputation: 726479
Assuming that the range of numbers has 100,000 values or fewer, you can use Counting Sort.
The idea is to use memory cells as counts for the numbers in the range. For example, if the range is 0..99999, inclusive, make an array int count[100000] and run through the file incrementing the counts:
count[itemFromFile]++;
Once you have gone through the whole file, go through the range again. For each count[x] that is not zero, output x that many times. The result will be the original data sorted in ascending order.
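A minimal sketch of this, assuming the values lie in 0..99999 and are stored as raw 32-bit ints, with placeholder file names; iterating the counts from the high end down produces the descending order the question asks for rather than ascending.

#include <cstdint>
#include <fstream>
#include <vector>

int main() {
    std::vector<int32_t> count(100'000, 0);     // the single 100k-cell array

    std::ifstream in("input.bin", std::ios::binary);
    int32_t x;
    while (in.read(reinterpret_cast<char*>(&x), sizeof x))
        ++count[x];                             // tally each value (assumed to be in range)

    std::ofstream out("sorted.bin", std::ios::binary);
    for (int32_t v = 99'999; v >= 0; --v)       // high to low => descending output
        for (int32_t c = count[v]; c > 0; --c)
            out.write(reinterpret_cast<char*>(&v), sizeof v);
}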
Upvotes: 2