JC Grubbs

Reputation: 40311

Select N random elements from a List<T> in C#

I need a quick algorithm to select 5 random elements from a generic list. For example, I'd like to get 5 random elements from a List<string>.

Upvotes: 216

Views: 176197

Answers (30)

Mushroomator

Reputation: 9198

>= .NET 8: GetItems<T>()

With .NET 8 we get a built-in way to solve this problem: the Random.GetItems<T>() method (also available on RandomNumberGenerator).

The method takes an array or a ReadOnlySpan<T> and returns an array of chosen elements, or writes into a destination span (see the overloads).

var strings = new List<string>
{
    "a",
    "b",
    "c"
};
var numberOfStringsToSelect = 2;
var randomStrings = Random.Shared.GetItems<string>(CollectionsMarshal.AsSpan(strings), numberOfStringsToSelect);
Console.WriteLine(string.Join(", ", randomStrings));

I am using CollectionsMarshal.AsSpan() to create a Span<T> from the List<T>. While that span is in use, you should not add or remove any items from the list. Alternatively, you can turn the list into an array and pass that array to GetItems<T>().
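
For example, a minimal sketch of the array-based alternative (same strings and numberOfStringsToSelect as above):

string[] asArray = strings.ToArray(); // copying avoids the "don't mutate the list" caveat of CollectionsMarshal.AsSpan()
string[] randomFromArray = Random.Shared.GetItems(asArray, numberOfStringsToSelect);
Console.WriteLine(string.Join(", ", randomFromArray));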

Upvotes: 5

Anno

Reputation: 861

// Can include duplicates.
List<string> GetRandomElements(List<string> yourList, int numberOfElements){
  var random = new Random();
  var randomList = new List<string>(numberOfElements); // Pre-allocate capacity for a significant speedup.

  for (int i = 0; i < numberOfElements; i++){
    var randomIndex = random.Next(0, yourList.Count);
    var randomElement = yourList[randomIndex];
    randomList.Add(randomElement);
  }

  return randomList;
}

Upvotes: 0

user6574009

Reputation: 59

public static IEnumerable<Element> GetRandomElements(this IList<Element> list, int n)
{
    var count = list.Count;
    if (count < n)
    {
        throw new ArgumentException("n cannot be bigger than the list size.");
    }
    var indexes = new HashSet<int>();
    while (indexes.Count < n)
    {
        indexes.Add(Random.Next(count)); // Random is an instance defined outside the method scope
    }
    return indexes.Select(x => list[x]);
}

I used this as a reference: Performance of Arrays vs. Lists

The implementation is fine because a list is fast at retrieving an element by index.

"Random" is defined outside of the method scope; the HashSet ensures the uniqueness of each index.

Limits: the algorithm works best if the list is big and n is small. Otherwise, collisions may make the while loop take a long time.

In this case using

public static IEnumerable<Element> GetRandomElements(this IList<Element> list, int n)
{
    return list.OrderBy(x => Random.Next()).Take(n);
}

could be a viable option.

Upvotes: -1

Lesmian

Reputation: 3952

I'd like to share my method. Reading other answers I was wondering if we really need to keep track of chosen items to uphold uniqueness of the results. Usually it slows down the algorithm because you need to repeat the draw if you happen to choose the same item again. So I came up with something different. If you don't care about modifying the input list you can shuffle the items in one go so that chosen items end up at the beginning of the list.

So in each iteration you choose an item and then swap it toward the front of the list. As a result you end up with the random items at the start of the input list. The downside is that the input list's order is modified, but you don't need to repeat any draws and the results are unique. No additional memory allocation is needed, and it works really quickly even for edge cases like selecting all items from the list at random.

Here is the code:

public IEnumerable<T> Random_Switch<T>(IList<T> list, int needed)
{
    for (int i = 0; i < needed; i++)
    {
        var index = _rnd.Next(i, list.Count);
        var item = list[index];
        list[index] = list[i];
        list[i] = item;
    }
    
    return list.Take(needed);
}

I also did some benchmarks benefiting from @Leaky's answer, and here are the results:

|             Method | ListSize | SelectionSize |        Mean |       Error |      StdDev |      Median |   Gen0 | Allocated |
|------------------- |--------- |-------------- |------------:|------------:|------------:|------------:|-------:|----------:|
| Test_IterateSelect |       50 |             5 |    662.2 ns |    13.19 ns |    27.54 ns |    660.9 ns | 0.0477 |     200 B |
| Test_RandomIndices |       50 |             5 |    256.6 ns |     5.12 ns |    12.86 ns |    254.0 ns | 0.0992 |     416 B |
|   Test_FisherYates |       50 |             5 |    405.4 ns |     8.05 ns |    17.33 ns |    401.7 ns | 0.1407 |     590 B |
|  Test_RandomSwitch |       50 |             5 |    152.8 ns |     2.91 ns |     4.87 ns |    153.4 ns | 0.0305 |     128 B |

| Test_IterateSelect |       50 |            10 |    853.8 ns |    17.07 ns |    29.44 ns |    853.9 ns | 0.0687 |     288 B |
| Test_RandomIndices |       50 |            10 |    530.8 ns |    10.63 ns |    28.93 ns |    523.7 ns | 0.1812 |     760 B |
|   Test_FisherYates |       50 |            10 |    862.8 ns |    17.09 ns |    38.92 ns |    859.2 ns | 0.2527 |    1057 B |
|  Test_RandomSwitch |       50 |            10 |    267.4 ns |     5.28 ns |    13.81 ns |    266.4 ns | 0.0343 |     144 B |

| Test_IterateSelect |       50 |            25 |  1,195.6 ns |    23.58 ns |    46.54 ns |  1,199.1 ns | 0.1049 |     440 B |
| Test_RandomIndices |       50 |            25 |  1,455.8 ns |    28.81 ns |    58.20 ns |  1,444.0 ns | 0.3510 |    1472 B |
|   Test_FisherYates |       50 |            25 |  2,066.7 ns |    41.35 ns |    85.40 ns |  2,049.0 ns | 0.4463 |    1869 B |
|  Test_RandomSwitch |       50 |            25 |    610.0 ns |    11.90 ns |    20.83 ns |    610.5 ns | 0.0496 |     208 B |

| Test_IterateSelect |       50 |            50 |  1,436.7 ns |    28.51 ns |    61.37 ns |  1,430.1 ns | 0.1717 |     720 B |
| Test_RandomIndices |       50 |            50 |  6,478.1 ns |   122.70 ns |   247.86 ns |  6,488.7 ns | 0.7248 |    3048 B |
|   Test_FisherYates |       50 |            50 |  3,428.5 ns |    68.49 ns |   118.15 ns |  3,424.5 ns | 0.5455 |    2296 B |
|  Test_RandomSwitch |       50 |            50 |  1,186.8 ns |    23.38 ns |    48.81 ns |  1,179.4 ns | 0.0725 |     304 B |

| Test_IterateSelect |      500 |             5 |  4,374.6 ns |    80.43 ns |   107.37 ns |  4,362.9 ns | 0.0458 |     200 B |
| Test_RandomIndices |      500 |             5 |    252.3 ns |     5.05 ns |    13.21 ns |    251.3 ns | 0.0992 |     416 B |
|   Test_FisherYates |      500 |             5 |    398.0 ns |     7.97 ns |    18.48 ns |    399.3 ns | 0.1411 |     592 B |
|  Test_RandomSwitch |      500 |             5 |    155.4 ns |     3.10 ns |     7.24 ns |    155.0 ns | 0.0305 |     128 B |

| Test_IterateSelect |      500 |            10 |  4,950.1 ns |    96.72 ns |   150.58 ns |  4,942.7 ns | 0.0687 |     288 B |
| Test_RandomIndices |      500 |            10 |    490.0 ns |     9.70 ns |    20.66 ns |    490.6 ns | 0.1812 |     760 B |
|   Test_FisherYates |      500 |            10 |    805.2 ns |    15.70 ns |    20.96 ns |    808.2 ns | 0.2556 |    1072 B |
|  Test_RandomSwitch |      500 |            10 |    254.1 ns |     5.09 ns |    13.31 ns |    253.6 ns | 0.0343 |     144 B |

| Test_IterateSelect |      500 |            25 |  5,785.1 ns |   115.19 ns |   201.74 ns |  5,800.2 ns | 0.0992 |     440 B |
| Test_RandomIndices |      500 |            25 |  1,123.6 ns |    22.31 ns |    53.03 ns |  1,119.6 ns | 0.3510 |    1472 B |
|   Test_FisherYates |      500 |            25 |  1,959.1 ns |    38.82 ns |    91.51 ns |  1,971.1 ns | 0.4807 |    2016 B |
|  Test_RandomSwitch |      500 |            25 |    601.1 ns |    11.83 ns |    23.63 ns |    599.8 ns | 0.0496 |     208 B |

| Test_IterateSelect |      500 |            50 |  6,570.5 ns |   127.03 ns |   190.13 ns |  6,599.8 ns | 0.1678 |     720 B |
| Test_RandomIndices |      500 |            50 |  2,199.6 ns |    43.23 ns |    73.41 ns |  2,198.6 ns | 0.7286 |    3048 B |
|   Test_FisherYates |      500 |            50 |  3,830.0 ns |    76.33 ns |   159.33 ns |  3,809.9 ns | 0.9842 |    4128 B |
|  Test_RandomSwitch |      500 |            50 |  1,150.7 ns |    22.60 ns |    34.52 ns |  1,156.7 ns | 0.0725 |     304 B |

| Test_IterateSelect |     5000 |             5 | 42,833.1 ns |   779.35 ns | 1,463.80 ns | 42,758.9 ns |      - |     200 B |
| Test_RandomIndices |     5000 |             5 |    248.9 ns |     4.95 ns |     9.29 ns |    248.8 ns | 0.0992 |     416 B |
|   Test_FisherYates |     5000 |             5 |    388.9 ns |     7.79 ns |    17.90 ns |    387.0 ns | 0.1411 |     592 B |
|  Test_RandomSwitch |     5000 |             5 |    153.8 ns |     3.10 ns |     6.41 ns |    154.7 ns | 0.0305 |     128 B |

| Test_IterateSelect |     5000 |            10 | 46,814.2 ns |   914.35 ns | 1,311.33 ns | 46,822.7 ns | 0.0610 |     288 B |
| Test_RandomIndices |     5000 |            10 |    498.9 ns |    10.01 ns |    28.56 ns |    491.1 ns | 0.1812 |     760 B |
|   Test_FisherYates |     5000 |            10 |    800.1 ns |    14.44 ns |    29.83 ns |    796.3 ns | 0.2556 |    1072 B |
|  Test_RandomSwitch |     5000 |            10 |    271.6 ns |     5.45 ns |    15.63 ns |    269.2 ns | 0.0343 |     144 B |

| Test_IterateSelect |     5000 |            25 | 50,900.4 ns | 1,000.71 ns | 1,951.81 ns | 51,068.5 ns | 0.0610 |     440 B |
| Test_RandomIndices |     5000 |            25 |  1,112.7 ns |    20.06 ns |    30.63 ns |  1,114.6 ns | 0.3510 |    1472 B |
|   Test_FisherYates |     5000 |            25 |  1,965.9 ns |    38.82 ns |    62.68 ns |  1,953.2 ns | 0.4807 |    2016 B |
|  Test_RandomSwitch |     5000 |            25 |    610.7 ns |    12.23 ns |    20.76 ns |    613.6 ns | 0.0496 |     208 B |

| Test_IterateSelect |     5000 |            50 | 52,062.6 ns | 1,031.59 ns | 1,694.93 ns | 51,882.6 ns | 0.1221 |     720 B |
| Test_RandomIndices |     5000 |            50 |  2,203.7 ns |    43.90 ns |    87.67 ns |  2,197.9 ns | 0.7286 |    3048 B |
|   Test_FisherYates |     5000 |            50 |  3,729.2 ns |    73.08 ns |   124.10 ns |  3,701.8 ns | 0.9842 |    4128 B |
|  Test_RandomSwitch |     5000 |            50 |  1,185.1 ns |    23.29 ns |    39.54 ns |  1,186.5 ns | 0.0725 |     304 B |

Also I guess if you really need to keep the input list unmodified you could store the indices that were switched and revert the order before returning from the function, but that of course would cause additional allocations.
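
A rough sketch of that idea (a hypothetical Random_Switch_Preserving variant that records the swaps and undoes them before returning):

public IEnumerable<T> Random_Switch_Preserving<T>(IList<T> list, int needed)
{
    var swaps = new (int I, int Index)[needed]; // remember each swap so it can be undone

    for (int i = 0; i < needed; i++)
    {
        var index = _rnd.Next(i, list.Count);
        (list[i], list[index]) = (list[index], list[i]);
        swaps[i] = (i, index);
    }

    var result = list.Take(needed).ToArray(); // materialize before restoring the order

    for (int i = needed - 1; i >= 0; i--)     // undo the swaps in reverse
        (list[swaps[i].I], list[swaps[i].Index]) = (list[swaps[i].Index], list[swaps[i].I]);

    return result;
}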

Upvotes: 1

Cardinal

Reputation: 2175

public static IEnumerable<T> GetRandom<T>(IList<T> list, int count, Random random)
    {
        // Probably you should throw exception if count > list.Count
        count = Math.Min(list.Count, count);

        var selectedIndices = new SortedSet<int>();

        // Random upper bound (exclusive)
        int randomMax = list.Count;

        while (selectedIndices.Count < count)
        {
            int randomIndex = random.Next(0, randomMax);

            // skip over already selected indices
            foreach (var selectedIndex in selectedIndices)
                if (selectedIndex <= randomIndex)
                    ++randomIndex;
                else
                    break;

            yield return list[randomIndex];

            selectedIndices.Add(randomIndex);
            --randomMax;
        }
    }

Memory: ~count
Complexity: O(count²)

Upvotes: 0

Kuba Szostak

Reputation: 472

public static IEnumerable<TItem> RandomSample<TItem>(this IReadOnlyList<TItem> items, int count) 
{
    if (count < 1 || count > items.Count)
    {
        throw new ArgumentOutOfRangeException(nameof(count));
    }
    List<int> indexes = Enumerable.Range(0, items.Count).ToList();
    int yieldedCount = 0;

    while (yieldedCount < count)
    {
        int i = RandomNumberGenerator.GetInt32(indexes.Count);
        int randomIndex = indexes[i];
        yield return items[randomIndex];

        // indexes.RemoveAt(i);                  // Avoid removing items from the middle of the list
        indexes[i] = indexes[indexes.Count - 1]; // Replace yielded index with the last one
        indexes.RemoveAt(indexes.Count - 1);     
        yieldedCount++;
    }
}

Upvotes: 1

TXNPRS

Reputation: 9

Short and simple. Hope this helps someone!

if (list.Count > maxListCount)
{
    var rndList = new List<YourEntity>();
    var r = new Random();
    
    while (rndList.Count < maxListCount)
    {
        var addingElement = list[r.Next(list.Count)];

        // element uniqueness check - keep whichever condition fits your case:
        if (rndList.Contains(addingElement))
        //if (rndList.Any(p => p.Id == addingElement.Id))
            continue;
    
        rndList.Add(addingElement);
    }
    
    return rndList;
}

Upvotes: 0

Leaky

Reputation: 3636

Here is a benchmark of three different methods:

  1. The implementation of the accepted answer from Kyle.
  2. An approach based on random index selection with HashSet duplication filtering, from drzaus.
  3. A more academic approach posted by Jesús López, called Fisher–Yates shuffle.

The testing will consist of benchmarking the performance with multiple different list sizes and selection sizes.

I also included a measurement of the standard deviation of these three methods, i.e. how well distributed the random selection appears to be.

In a nutshell, drzaus's simple solution seems to be the best overall of these three. The selected answer is great and elegant, but it's not that efficient, given that its time complexity depends on the full list size, not on the selection size. Consequently, if you select a small number of items from a long list, it will take orders of magnitude more time. Of course it still performs better than the solutions based on completely reordering the list.

Curiously enough, this O(n) time cost shows up even if you only touch the list when you actually return an item, like I do in my implementation. The only explanation I can think of is that Random.Next() is pretty slow, and that performance benefits if you generate only one random number per selected item.

And, also interestingly, the StdDev of Kyle's solution was significantly higher comparatively. I have no clue why; maybe the fault is in my implementation.

Sorry for the long code and output that follows, but I hope it's somewhat illuminating. Also, if you spot any issues in the tests or implementations, let me know and I'll fix it.

static void Main()
{
    BenchmarkRunner.Run<Benchmarks>();

    new Benchmarks() { ListSize = 100, SelectionSize = 10 }
        .BenchmarkStdDev();
}

[MemoryDiagnoser]
public class Benchmarks
{
    [Params(50, 500, 5000)]
    public int ListSize;

    [Params(5, 10, 25, 50)]
    public int SelectionSize;

    private Random _rnd;
    private List<int> _list;
    private int[] _hits;

    [GlobalSetup]
    public void Setup()
    {
        _rnd = new Random(12345);
        _list = Enumerable.Range(0, ListSize).ToList();
        _hits = new int[ListSize];
    }

    [Benchmark]
    public void Test_IterateSelect()
        => Random_IterateSelect(_list, SelectionSize).ToList();

    [Benchmark]
    public void Test_RandomIndices() 
        => Random_RandomIndices(_list, SelectionSize).ToList();

    [Benchmark]
    public void Test_FisherYates() 
        => Random_FisherYates(_list, SelectionSize).ToList();

    public void BenchmarkStdDev()
    {
        RunOnce(Random_IterateSelect, "IterateSelect");
        RunOnce(Random_RandomIndices, "RandomIndices");
        RunOnce(Random_FisherYates, "FisherYates");

        void RunOnce(Func<IEnumerable<int>, int, IEnumerable<int>> method, string methodName)
        {
            Setup();
            for (int i = 0; i < 1000000; i++)
            {
                var selected = method(_list, SelectionSize).ToList();
                Debug.Assert(selected.Count() == SelectionSize);
                foreach (var item in selected) _hits[item]++;
            }
            var stdDev = GetStdDev(_hits);
            Console.WriteLine($"StdDev of {methodName}: {stdDev :n} (% of average: {stdDev / (_hits.Average() / 100) :n})");
        }

        double GetStdDev(IEnumerable<int> hits)
        {
            var average = hits.Average();
            return Math.Sqrt(hits.Average(v => Math.Pow(v - average, 2)));
        }
    }

    public IEnumerable<T> Random_IterateSelect<T>(IEnumerable<T> collection, int needed)
    {
        var count = collection.Count();
        for (int i = 0; i < count; i++)
        {
            if (_rnd.Next(count - i) < needed)
            {
                yield return collection.ElementAt(i);
                if (--needed == 0)
                    yield break;
            }
        }
    }

    public IEnumerable<T> Random_RandomIndices<T>(IEnumerable<T> list, int needed)
    {
        var selectedItems = new HashSet<T>();
        var count = list.Count();

        while (needed > 0)
            if (selectedItems.Add(list.ElementAt(_rnd.Next(count))))
                needed--;

        return selectedItems;
    }

    public IEnumerable<T> Random_FisherYates<T>(IEnumerable<T> list, int sampleSize)
    {
        var count = list.Count();
        if (sampleSize > count) throw new ArgumentException("sampleSize may not be greater than list count", "sampleSize");
        var indices = new Dictionary<int, int>(); int index;

        for (int i = 0; i < sampleSize; i++)
        {
            int j = _rnd.Next(i, count);
            if (!indices.TryGetValue(j, out index)) index = j;

            yield return list.ElementAt(index);

            if (!indices.TryGetValue(i, out index)) index = i;
            indices[j] = index;
        }
    }
}

Output:

|        Method | ListSize | Select |        Mean |     Error |    StdDev |  Gen 0 | Allocated |
|-------------- |--------- |------- |------------:|----------:|----------:|-------:|----------:|
| IterateSelect |       50 |      5 |    711.5 ns |   5.19 ns |   4.85 ns | 0.0305 |     144 B |
| RandomIndices |       50 |      5 |    341.1 ns |   4.48 ns |   4.19 ns | 0.0644 |     304 B |
|   FisherYates |       50 |      5 |    573.5 ns |   6.12 ns |   5.72 ns | 0.0944 |     447 B |

| IterateSelect |       50 |     10 |    967.2 ns |   4.64 ns |   3.87 ns | 0.0458 |     220 B |
| RandomIndices |       50 |     10 |    709.9 ns |  11.27 ns |   9.99 ns | 0.1307 |     621 B |
|   FisherYates |       50 |     10 |  1,204.4 ns |  10.63 ns |   9.94 ns | 0.1850 |     875 B |

| IterateSelect |       50 |     25 |  1,358.5 ns |   7.97 ns |   6.65 ns | 0.0763 |     361 B |
| RandomIndices |       50 |     25 |  1,958.1 ns |  15.69 ns |  13.91 ns | 0.2747 |    1298 B |
|   FisherYates |       50 |     25 |  2,878.9 ns |  31.42 ns |  29.39 ns | 0.3471 |    1653 B |

| IterateSelect |       50 |     50 |  1,739.1 ns |  15.86 ns |  14.06 ns | 0.1316 |     629 B |
| RandomIndices |       50 |     50 |  8,906.1 ns |  88.92 ns |  74.25 ns | 0.5951 |    2848 B |
|   FisherYates |       50 |     50 |  4,899.9 ns |  38.10 ns |  33.78 ns | 0.4349 |    2063 B |

| IterateSelect |      500 |      5 |  4,775.3 ns |  46.96 ns |  41.63 ns | 0.0305 |     144 B |
| RandomIndices |      500 |      5 |    327.8 ns |   2.82 ns |   2.50 ns | 0.0644 |     304 B |
|   FisherYates |      500 |      5 |    558.5 ns |   7.95 ns |   7.44 ns | 0.0944 |     449 B |

| IterateSelect |      500 |     10 |  5,387.1 ns |  44.57 ns |  41.69 ns | 0.0458 |     220 B |
| RandomIndices |      500 |     10 |    648.0 ns |   9.12 ns |   8.54 ns | 0.1307 |     621 B |
|   FisherYates |      500 |     10 |  1,154.6 ns |  13.66 ns |  12.78 ns | 0.1869 |     889 B |

| IterateSelect |      500 |     25 |  6,442.3 ns |  48.90 ns |  40.83 ns | 0.0763 |     361 B |
| RandomIndices |      500 |     25 |  1,569.6 ns |  15.79 ns |  14.77 ns | 0.2747 |    1298 B |
|   FisherYates |      500 |     25 |  2,726.1 ns |  25.32 ns |  22.44 ns | 0.3777 |    1795 B |

| IterateSelect |      500 |     50 |  7,775.4 ns |  35.47 ns |  31.45 ns | 0.1221 |     629 B |
| RandomIndices |      500 |     50 |  2,976.9 ns |  27.11 ns |  24.03 ns | 0.6027 |    2848 B |
|   FisherYates |      500 |     50 |  5,383.2 ns |  36.49 ns |  32.35 ns | 0.8163 |    3870 B |

| IterateSelect |     5000 |      5 | 45,208.6 ns | 459.92 ns | 430.21 ns |      - |     144 B |
| RandomIndices |     5000 |      5 |    328.7 ns |   5.15 ns |   4.81 ns | 0.0644 |     304 B |
|   FisherYates |     5000 |      5 |    556.1 ns |  10.75 ns |  10.05 ns | 0.0944 |     449 B |

| IterateSelect |     5000 |     10 | 49,253.9 ns | 420.26 ns | 393.11 ns |      - |     220 B |
| RandomIndices |     5000 |     10 |    642.9 ns |   4.95 ns |   4.13 ns | 0.1307 |     621 B |
|   FisherYates |     5000 |     10 |  1,141.9 ns |  12.81 ns |  11.98 ns | 0.1869 |     889 B |

| IterateSelect |     5000 |     25 | 54,044.4 ns | 208.92 ns | 174.46 ns | 0.0610 |     361 B |
| RandomIndices |     5000 |     25 |  1,480.5 ns |  11.56 ns |  10.81 ns | 0.2747 |    1298 B |
|   FisherYates |     5000 |     25 |  2,713.9 ns |  27.31 ns |  24.21 ns | 0.3777 |    1795 B |

| IterateSelect |     5000 |     50 | 54,418.2 ns | 329.62 ns | 308.32 ns | 0.1221 |     629 B |
| RandomIndices |     5000 |     50 |  2,886.4 ns |  36.53 ns |  34.17 ns | 0.6027 |    2848 B |
|   FisherYates |     5000 |     50 |  5,347.2 ns |  59.45 ns |  55.61 ns | 0.8163 |    3870 B |

StdDev of IterateSelect: 671.88 (% of average: 0.67)
StdDev of RandomIndices: 296.07 (% of average: 0.30)
StdDev of FisherYates: 280.47 (% of average: 0.28)

Upvotes: 2

Kyle Cronin

Reputation: 79113

Iterate through and for each element make the probability of selection = (number needed)/(number left)

So if you had 40 items, the first would have a 5/40 chance of being selected. If it is, the next has a 4/39 chance, otherwise it has a 5/39 chance. By the time you get to the end you will have your 5 items, and often you'll have all of them before that.

This technique is called selection sampling, a special case of Reservoir Sampling. It's similar in performance to shuffling the input, but of course allows the sample to be generated without modifying the original data.
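
A minimal sketch of that idea (assuming a List<string> named list and a Random named rng):

var result = new List<string>();
int needed = 5;
int left = list.Count;
foreach (var item in list)
{
    // select with probability (number needed) / (number left)
    if (rng.Next(left) < needed)
    {
        result.Add(item);
        if (--needed == 0) break;
    }
    left--;
}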

Upvotes: 152

DontPanic345

Reputation: 418

12 years on and this question is still active. I didn't find an implementation of Kyle's solution that I liked, so here it is:

public IEnumerable<T> TakeRandom<T>(IEnumerable<T> collection, int take)
{
    var random = new Random();
    var available = collection.Count();
    var needed = take;
    foreach (var item in collection)
    {
        if (random.Next(available) < needed)
        {
            needed--;
            yield return item;
            if (needed == 0)
            {
                break;
            }
        }
        available--;
    }
}

Upvotes: 12

Cyrille

Reputation: 27

This will solve your issue

var entries = new List<T>();
var selectedItems = new List<T>();
var random = new Random(); // create one Random instance; new Random() inside the loop can repeat values

for (var i = 0; i != 10; i++)
{
    var rdm = random.Next(entries.Count);
    while (selectedItems.Contains(entries[rdm]))
        rdm = random.Next(entries.Count);

    selectedItems.Add(entries[rdm]);
}

Upvotes: -2

spoulson

Reputation: 21591

From Dragons in the Algorithm, an interpretation in C#:

int k = 10; // items to select
var items = new List<int>(new[] { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12 });
var selected = new List<int>();
double needed = k;
double available = items.Count;
var rand = new Random();
while (selected.Count < k) {
   if( rand.NextDouble() < needed / available ) {
      selected.Add(items[(int)available-1]);
      needed--;
   }
   available--;
}

This algorithm will select unique indices of the items list.

Upvotes: 9

Jesse Gador

Reputation: 85

Goal: select N items from a source collection without duplication. I created an extension method for any generic collection. Here's how I did it:

public static class CollectionExtension
{
    public static IList<TSource> RandomizeCollection<TSource>(this IList<TSource> source, int maxItems)
    {
        int randomCount = source.Count > maxItems ? maxItems : source.Count;
        int?[] randomizedIndices = new int?[randomCount];
        Random random = new Random();

        for (int i = 0; i < randomizedIndices.Length; i++)
        {
            int randomResult = -1;
            while (randomizedIndices.Contains((randomResult = random.Next(0, source.Count))))
            {
                //0 -> since all list starts from index 0; source.Count -> maximum number of items that can be randomize
                //continue looping while the generated random number is already in the list of randomizedIndices
            }

            randomizedIndices[i] = randomResult;
        }

        IList<TSource> result = new List<TSource>();
        foreach (int index in randomizedIndices)
            result.Add(source.ElementAt(index));

        return result;
    }
}

Upvotes: 0

hardsetting

Reputation: 4160

Extending from @ers's answer, if one is worried about possible different implementations of OrderBy, this should be safe:

// Instead of this
YourList.OrderBy(x => rnd.Next()).Take(5)

// Temporarily transform 
YourList
    .Select(v => new {v, i = rnd.Next()}) // Associate a random index to each entry
    .OrderBy(x => x.i).Take(5) // Sort by (at this point fixed) random index 
    .Select(x => x.v); // Go back to enumerable of entry

Upvotes: 6

Jesús López

Reputation: 9221

Here is an implementation based on the Fisher-Yates shuffle whose complexity is O(n), where n is the subset or sample size rather than the list size, as John Shedletsky pointed out.

public static IEnumerable<T> GetRandomSample<T>(this IList<T> list, int sampleSize)
{
    if (list == null) throw new ArgumentNullException("list");
    if (sampleSize > list.Count) throw new ArgumentException("sampleSize may not be greater than list count", "sampleSize");
    var indices = new Dictionary<int, int>(); int index;
    var rnd = new Random();

    for (int i = 0; i < sampleSize; i++)
    {
        int j = rnd.Next(i, list.Count);
        if (!indices.TryGetValue(j, out index)) index = j;

        yield return list[index];

        if (!indices.TryGetValue(i, out index)) index = i;
        indices[j] = index;
    }
}

Upvotes: 6

Wolf5

Reputation: 17140

Using LINQ with large lists (when costly to touch each element) AND if you can live with the possibility of duplicates:

new int[5].Select(o => (int)(rnd.NextDouble() * maxIndex)).Select(i => YourIEnum.ElementAt(i))

For my use case I had a list of 100,000 elements, and because they were being pulled from a DB, this roughly halved (or better) the time compared to running a random selection over the whole list.

Having a large list greatly reduces the odds of duplicates.
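
A slightly more complete sketch of the same idea (rnd, maxIndex and YourIEnum are assumed names; duplicates are still possible):

var rnd = new Random();
var maxIndex = 100000; // number of elements available in YourIEnum (assumed)
var five = new int[5]
    .Select(o => (int)(rnd.NextDouble() * maxIndex)) // five random indices
    .Select(i => YourIEnum.ElementAt(i))
    .ToList();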

Upvotes: 0

Dai

Reputation: 1

When N is very large, the normal method that randomly shuffles the N numbers and selects, say, the first k numbers can be prohibitive because of its space complexity. The following algorithm requires only O(k) time and space.

http://arxiv.org/abs/1512.00501

import random

def random_selection_indices(num_samples, N):
    modified_entries = {}
    seq = []
    for n in xrange(num_samples):
        i = N - n - 1
        j = random.randrange(i)

        # swap a[j] and a[i] 
        a_j = modified_entries[j] if j in modified_entries else j 
        a_i = modified_entries[i] if i in modified_entries else i

        if a_i != j:
            modified_entries[j] = a_i   
        elif j in modified_entries:   # no need to store the modified value if it is the same as index
            modified_entries.pop(j)

        if a_j != i:
            modified_entries[i] = a_j 
        elif i in modified_entries:   # no need to store the modified value if it is the same as index
            modified_entries.pop(i)
        seq.append(a_j)
    return seq

Upvotes: -1

Kvam

Reputation: 2218

I would use an extension method.

    public static IEnumerable<T> TakeRandom<T>(this IEnumerable<T> elements, int countToTake)
    {
        var random = new Random();

        var internalList = elements.ToList();

        var selected = new List<T>();
        for (var i = 0; i < countToTake; ++i)
        {
            var next = random.Next(0, internalList.Count - selected.Count);
            selected.Add(internalList[next]);
            internalList[next] = internalList[internalList.Count - selected.Count];
        }
        return selected;
    }

Upvotes: 0

Paul Chernoch

Reputation: 5553

I combined several of the above answers to create a Lazily-evaluated extension method. My testing showed that Kyle's approach (Order(N)) is many times slower than drzaus' use of a set to propose the random indices to choose (Order(K)). The former performs many more calls to the random number generator, plus iterates more times over the items.

The goals of my implementation were:

1) Do not realize the full list if given an IEnumerable that is not an IList. If I am given a sequence of a zillion items, I do not want to run out of memory. Use Kyle's approach for an on-line solution.

2) If I can tell that it is an IList, use drzaus' approach, with a twist. If K is more than half of N, I risk thrashing as I choose many random indices again and again and have to skip them. Thus I compose a list of the indices to NOT keep.

3) I guarantee that the items will be returned in the same order that they were encountered. Kyle's algorithm required no alteration. drzaus' algorithm required that I not emit items in the order that the random indices are chosen. I gather all the indices into a SortedSet, then emit items in sorted index order.

4) If K is large compared to N and I invert the sense of the set, then I enumerate all items and test if the index is not in the set. This means that I lose the Order(K) run time, but since K is close to N in these cases, I do not lose much.

Here is the code:

    /// <summary>
    /// Takes k elements from the next n elements at random, preserving their order.
    /// 
    /// If there are fewer than n elements in items, this may return fewer than k elements.
    /// </summary>
    /// <typeparam name="TElem">Type of element in the items collection.</typeparam>
    /// <param name="items">Items to be randomly selected.</param>
    /// <param name="k">Number of items to pick.</param>
    /// <param name="n">Total number of items to choose from.
    /// If the items collection contains more than this number, the extra members will be skipped.
    /// If the items collection contains fewer than this number, it is possible that fewer than k items will be returned.</param>
    /// <returns>Enumerable over the retained items.
    /// 
    /// See http://stackoverflow.com/questions/48087/select-a-random-n-elements-from-listt-in-c-sharp for the commentary.
    /// </returns>
    public static IEnumerable<TElem> TakeRandom<TElem>(this IEnumerable<TElem> items, int k, int n)
    {
        var r = new FastRandom();
        var itemsList = items as IList<TElem>;

        if (k >= n || (itemsList != null && k >= itemsList.Count))
            foreach (var item in items) yield return item;
        else
        {  
            // If we have a list, we can infer more information and choose a better algorithm.
            // When using an IList, this is about 7 times faster (on one benchmark)!
            if (itemsList != null && k < n/2)
            {
                // Since we have a List, we can use an algorithm suitable for Lists.
                // If there are fewer than n elements, reduce n.
                n = Math.Min(n, itemsList.Count);

                // This algorithm picks K index-values randomly and directly chooses those items to be selected.
                // If k is more than half of n, then we will spend a fair amount of time thrashing, picking
                // indices that we have already picked and having to try again.   
                var invertSet = k >= n/2;  
                var positions = invertSet ? (ISet<int>) new HashSet<int>() : (ISet<int>) new SortedSet<int>();

                var numbersNeeded = invertSet ? n - k : k;
                while (numbersNeeded > 0)
                    if (positions.Add(r.Next(0, n))) numbersNeeded--;

                if (invertSet)
                {
                    // positions contains all the indices of elements to Skip.
                    for (var itemIndex = 0; itemIndex < n; itemIndex++)
                    {
                        if (!positions.Contains(itemIndex))
                            yield return itemsList[itemIndex];
                    }
                }
                else
                {
                    // positions contains all the indices of elements to Take.
                    foreach (var itemIndex in positions)
                        yield return itemsList[itemIndex];              
                }
            }
            else
            {
                // Since we do not have a list, we will use an online algorithm.
                // This permits us to skip the rest as soon as we have enough items.
                var found = 0;
                var scanned = 0;
                foreach (var item in items)
                {
                    var rand = r.Next(0,n-scanned);
                    if (rand < k - found)
                    {
                        yield return item;
                        found++;
                    }
                    scanned++;
                    if (found >= k || scanned >= n)
                        break;
                }
            }
        }  
    } 

I use a specialized random number generator, but you can just use C#'s Random if you want. (FastRandom was written by Colin Green and is part of SharpNEAT. It has a period of 2^128-1 which is better than many RNGs.)

Here are the unit tests:

[TestClass]
public class TakeRandomTests
{
    /// <summary>
    /// Ensure that when randomly choosing items from an array, all items are chosen with roughly equal probability.
    /// </summary>
    [TestMethod]
    public void TakeRandom_Array_Uniformity()
    {
        const int numTrials = 2000000;
        const int expectedCount = numTrials/20;
        var timesChosen = new int[100];
        var century = new int[100];
        for (var i = 0; i < century.Length; i++)
            century[i] = i;

        for (var trial = 0; trial < numTrials; trial++)
        {
            foreach (var i in century.TakeRandom(5, 100))
                timesChosen[i]++;
        }
        var avg = timesChosen.Average();
        var max = timesChosen.Max();
        var min = timesChosen.Min();
        var allowedDifference = expectedCount/100;
        AssertBetween(avg, expectedCount - 2, expectedCount + 2, "Average");
        //AssertBetween(min, expectedCount - allowedDifference, expectedCount, "Min");
        //AssertBetween(max, expectedCount, expectedCount + allowedDifference, "Max");

        var countInRange = timesChosen.Count(i => i >= expectedCount - allowedDifference && i <= expectedCount + allowedDifference);
        Assert.IsTrue(countInRange >= 90, String.Format("Not enough were in range: {0}", countInRange));
    }

    /// <summary>
    /// Ensure that when randomly choosing items from an IEnumerable that is not an IList, 
    /// all items are chosen with roughly equal probability.
    /// </summary>
    [TestMethod]
    public void TakeRandom_IEnumerable_Uniformity()
    {
        const int numTrials = 2000000;
        const int expectedCount = numTrials / 20;
        var timesChosen = new int[100];

        for (var trial = 0; trial < numTrials; trial++)
        {
            foreach (var i in Range(0,100).TakeRandom(5, 100))
                timesChosen[i]++;
        }
        var avg = timesChosen.Average();
        var max = timesChosen.Max();
        var min = timesChosen.Min();
        var allowedDifference = expectedCount / 100;
        var countInRange =
            timesChosen.Count(i => i >= expectedCount - allowedDifference && i <= expectedCount + allowedDifference);
        Assert.IsTrue(countInRange >= 90, String.Format("Not enough were in range: {0}", countInRange));
    }

    private IEnumerable<int> Range(int low, int count)
    {
        for (var i = low; i < low + count; i++)
            yield return i;
    }

    private static void AssertBetween(int x, int low, int high, String message)
    {
        Assert.IsTrue(x > low, String.Format("Value {0} is less than lower limit of {1}. {2}", x, low, message));
        Assert.IsTrue(x < high, String.Format("Value {0} is more than upper limit of {1}. {2}", x, high, message));
    }

    private static void AssertBetween(double x, double low, double high, String message)
    {
        Assert.IsTrue(x > low, String.Format("Value {0} is less than lower limit of {1}. {2}", x, low, message));
        Assert.IsTrue(x < high, String.Format("Value {0} is more than upper limit of {1}. {2}", x, high, message));
    }
}

Upvotes: 8

Alex Gilbert

Reputation: 21

This method may be equivalent to Kyle's.

Say your list is of size n and you want k elements.

Random rand = new Random();
for(int i = 0; k>0; ++i) 
{
    int r = rand.Next(0, n-i);
    if(r<k) 
    {
        //include element i
        k--;
    }
} 

Works like a charm :)

-Alex Gilbert

Upvotes: 2

Tom Gullen

Reputation: 61725

Based on Kyle's answer, here's my C# implementation.

/// <summary>
/// Picks random selection of available game ID's
/// </summary>
private static List<int> GetRandomGameIDs(int count)
{       
    var gameIDs = (int[])HttpContext.Current.Application["NonDeletedArcadeGameIDs"];
    var totalGameIDs = gameIDs.Count();
    if (count > totalGameIDs) count = totalGameIDs;

    var rnd = new Random();
    var leftToPick = count;
    var itemsLeft = totalGameIDs;
    var arrPickIndex = 0;
    var returnIDs = new List<int>();
    while (leftToPick > 0)
    {
        if (rnd.Next(0, itemsLeft) < leftToPick)
        {
            returnIDs.Add(gameIDs[arrPickIndex]);
            leftToPick--;
        }
        arrPickIndex++;
        itemsLeft--;
    }

    return returnIDs;
}

Upvotes: 2

drzaus

Reputation: 25004

I was thinking about the comment by @JohnShedletsky on the accepted answer, which said (paraphrasing):

you should be able to do this in O(subset.Length), rather than O(originalList.Length)

Basically, you should be able to generate the random indices for the subset and then pluck those items from the original list.

The Method

public static class EnumerableExtensions {

    public static Random randomizer = new Random(); // you'd ideally be able to replace this with whatever makes you comfortable

    public static IEnumerable<T> GetRandom<T>(this IEnumerable<T> list, int numItems) {
        return (list as T[] ?? list.ToArray()).GetRandom(numItems);

        // because ReSharper whined about duplicate enumeration...
        /*
        items.Add(list.ElementAt(randomizer.Next(list.Count()))) ) numItems--;
        */
    }

    // just because the parentheses were getting confusing
    public static IEnumerable<T> GetRandom<T>(this T[] list, int numItems) {
        var items = new HashSet<T>(); // don't want to add the same item twice; otherwise use a list
        while (numItems > 0 )
            // if we successfully added it, move on
            if( items.Add(list[randomizer.Next(list.Length)]) ) numItems--;

        return items;
    }

    // and because it's really fun; note -- you may get repetition
    public static IEnumerable<T> PluckRandomly<T>(this IEnumerable<T> list) {
        while( true )
            yield return list.ElementAt(randomizer.Next(list.Count()));
    }

}

If you wanted to be even more efficient, you would probably use a HashSet of the indices, not the actual list elements (in case you've got complex types or expensive comparisons).
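
For example, a sketch of that index-based variant (a hypothetical addition to the EnumerableExtensions class above, reusing its randomizer field):

    public static IEnumerable<T> GetRandomByIndex<T>(this T[] list, int numItems) {
        var indices = new HashSet<int>(); // compare cheap ints instead of potentially expensive T instances
        while( indices.Count < numItems )
            indices.Add(randomizer.Next(list.Length));

        return indices.Select(i => list[i]);
    }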

The Unit Test

And to make sure we don't have any collisions, etc.

[TestClass]
public class RandomizingTests : UnitTestBase {
    [TestMethod]
    public void GetRandomFromList() {
        this.testGetRandomFromList((list, num) => list.GetRandom(num));
    }

    [TestMethod]
    public void PluckRandomly() {
        this.testGetRandomFromList((list, num) => list.PluckRandomly().Take(num), requireDistinct:false);
    }

    private void testGetRandomFromList(Func<IEnumerable<int>, int, IEnumerable<int>> methodToGetRandomItems, int numToTake = 10, int repetitions = 100000, bool requireDistinct = true) {
        var items = Enumerable.Range(0, 100);
        IEnumerable<int> randomItems = null;

        while( repetitions-- > 0 ) {
            randomItems = methodToGetRandomItems(items, numToTake);
            Assert.AreEqual(numToTake, randomItems.Count(),
                            "Did not get expected number of items {0}; failed at {1} repetition--", numToTake, repetitions);
            if(requireDistinct) Assert.AreEqual(numToTake, randomItems.Distinct().Count(),
                            "Collisions (non-unique values) found, failed at {0} repetition--", repetitions);
            Assert.IsTrue(randomItems.All(o => items.Contains(o)),
                        "Some unknown values found; failed at {0} repetition--", repetitions);
        }
    }
}

Upvotes: 7

apdnu

Reputation: 7315

This isn't as elegant or efficient as the accepted solution, but it's quick to write up. First, permute the array randomly, then select the first K elements. In Python:

import numpy as np

N = 20
K = 5

idx = np.arange(N)
np.random.shuffle(idx)

print(idx[:K])

Upvotes: 0

nawfal

Reputation: 73183

Selecting N random items from a group shouldn't have anything to do with order! Randomness is about unpredictability and not about shuffling positions in a group. All the answers that deal with some kind of ordering are bound to be less efficient than the ones that do not. Since efficiency is the key here, I will post something that doesn't change the order of items too much.

1) If you need true random values, which means there is no restriction on which elements can be chosen (i.e., a once-chosen item can be reselected):

public static List<T> GetTrueRandom<T>(this IList<T> source, int count, 
                                       bool throwArgumentOutOfRangeException = true)
{
    if (throwArgumentOutOfRangeException && count > source.Count)
        throw new ArgumentOutOfRangeException();

    var randoms = new List<T>(count);
    randoms.AddRandomly(source, count);
    return randoms;
}

If you set the exception flag off, then you can choose random items any number of times.

If you have { 1, 2, 3, 4 }, then it can give { 1, 4, 4 }, { 1, 4, 3 } etc for 3 items or even { 1, 4, 3, 2, 4 } for 5 items!

This should be pretty fast, as it has nothing to check.

2) If you need individual members from the group with no repetition, then I would rely on a dictionary (as many have pointed out already).

public static List<T> GetDistinctRandom<T>(this IList<T> source, int count)
{
    if (count > source.Count)
        throw new ArgumentOutOfRangeException();

    if (count == source.Count)
        return new List<T>(source);

    var sourceDict = source.ToIndexedDictionary();

    if (count > source.Count / 2)
    {
        while (sourceDict.Count > count)
            sourceDict.Remove(source.GetRandomIndex());

        return sourceDict.Select(kvp => kvp.Value).ToList();
    }

    var randomDict = new Dictionary<int, T>(count);
    while (randomDict.Count < count)
    {
        int key = source.GetRandomIndex();
        if (!randomDict.ContainsKey(key))
            randomDict.Add(key, sourceDict[key]);
    }

    return randomDict.Select(kvp => kvp.Value).ToList();
}

The code is a bit lengthier than the other dictionary approaches here because I'm not only adding, but also removing from the list, so it's kind of two loops. You can see here that I have not reordered anything at all when count becomes equal to source.Count. That's because I believe randomness should be in the returned set as a whole. I mean, if you want 5 random items from 1, 2, 3, 4, 5, it shouldn't matter whether it's 1, 3, 4, 2, 5 or 1, 2, 3, 4, 5, but if you need 4 items from the same set, then it should unpredictably yield 1, 2, 3, 4 or 1, 3, 5, 2 or 2, 3, 5, 4 etc. Secondly, when the count of random items to be returned is more than half of the original group, it's easier to remove source.Count - count items from the group than to add count items. For performance reasons I have used source instead of sourceDict to get the random index in the remove branch.

So if you have { 1, 2, 3, 4 }, this can end up in { 1, 2, 3 }, { 3, 4, 1 } etc for 3 items.

3) If you need truly distinct random values from your group by taking into account the duplicates in the original group, then you may use the same approach as above, but a HashSet will be lighter than a dictionary.

public static List<T> GetTrueDistinctRandom<T>(this IList<T> source, int count, 
                                               bool throwArgumentOutOfRangeException = true)
{
    if (count > source.Count)
        throw new ArgumentOutOfRangeException();

    var set = new HashSet<T>(source);

    if (throwArgumentOutOfRangeException && count > set.Count)
        throw new ArgumentOutOfRangeException();

    List<T> list = set.ToList();

    if (count >= set.Count)
        return list;

    if (count > set.Count / 2)
    {
        while (set.Count > count)
            set.Remove(list.GetRandom());

        return set.ToList();
    }

    var randoms = new HashSet<T>();
    randoms.AddRandomly(list, count);
    return randoms.ToList();
}

The randoms variable is made a HashSet to avoid duplicates being added in the rarest of rarest cases where Random.Next can yield the same value, especially when input list is small.

So { 1, 2, 2, 4 } => 3 random items => { 1, 2, 4 } and never { 1, 2, 2 }

{ 1, 2, 2, 4 } => 4 random items => exception!! or { 1, 2, 4 } depending on the flag set.

Some of the extension methods I have used:

static Random rnd = new Random();
public static int GetRandomIndex<T>(this ICollection<T> source)
{
    return rnd.Next(source.Count);
}

public static T GetRandom<T>(this IList<T> source)
{
    return source[source.GetRandomIndex()];
}

static void AddRandomly<T>(this ICollection<T> toCol, IList<T> fromList, int count)
{
    while (toCol.Count < count)
        toCol.Add(fromList.GetRandom());
}

public static Dictionary<int, T> ToIndexedDictionary<T>(this IEnumerable<T> lst)
{
    return lst.ToIndexedDictionary(t => t);
}

public static Dictionary<int, T> ToIndexedDictionary<S, T>(this IEnumerable<S> lst, 
                                                           Func<S, T> valueSelector)
{
    int index = -1;
    return lst.ToDictionary(t => ++index, valueSelector);
}

If it's all about performance, with tens of thousands of items in the list having to be iterated 10,000 times, then you may want a faster random class than System.Random, but I don't think that's a big deal considering the latter is most probably never the bottleneck; it's quite fast enough.

Edit: If you need to re-arrange the order of the returned items as well, then there's nothing that can beat dhakim's Fisher-Yates approach: short, sweet and simple.

Upvotes: 5

vag

Reputation: 589

public static List<T> GetRandomElements<T>(this IEnumerable<T> list, int elementsCount)
{
    return list.OrderBy(arg => Guid.NewGuid()).Take(elementsCount).ToList();
}

Upvotes: 54

Marwan Roushdy

Reputation: 1230

You can use this, but the ordering will happen on the client side:

 .AsEnumerable().OrderBy(n => Guid.NewGuid()).Take(5);

Upvotes: 8

dhakim

Reputation: 508

I just ran into this problem, and some more google searching brought me to the problem of randomly shuffling a list: http://en.wikipedia.org/wiki/Fisher-Yates_shuffle

To completely randomly shuffle your list (in place) you do this:

To shuffle an array a of n elements (indices 0..n-1):

  for i from n − 1 downto 1 do
       j ← random integer with 0 ≤ j ≤ i
       exchange a[j] and a[i]

If you only need the first 5 elements, then instead of running i all the way from n − 1 down to 1, you only need to run it down to n − 5 (i.e. 5 iterations).

Let's say you need k items.

This becomes:

  for (i = n − 1; i >= n-k; i--)
  {
       j = random integer with 0 ≤ j ≤ i
       exchange a[j] and a[i]
  }

Each item that is selected is swapped toward the end of the array, so the k elements selected are the last k elements of the array.

This takes time O(k), where k is the number of randomly selected elements you need.

Further, if you don't want to modify your initial list, you can write down all your swaps in a temporary list, reverse that list, and apply them again, thus performing the inverse set of swaps and restoring your initial list without changing the O(k) running time.

Finally, for the real stickler, if (n == k), you should stop at 1, not n-k, as the randomly chosen integer will always be 0.
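
A minimal C# sketch of the partial shuffle above (assuming a Random instance named rng; the selected elements end up at the tail of the list):

public static List<T> TakeRandomFisherYates<T>(IList<T> a, int k, Random rng)
{
    int n = a.Count;
    for (int i = n - 1; i >= n - k; i--)
    {
        int j = rng.Next(0, i + 1);      // random integer with 0 <= j <= i
        (a[j], a[i]) = (a[i], a[j]);     // exchange a[j] and a[i]
    }
    return a.Skip(n - k).ToList();       // the last k elements are the selection
}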

Upvotes: 15

Kristofer

Reputation: 21

Here's my approach (full text here http://krkadev.blogspot.com/2010/08/random-numbers-without-repetition.html ).

It should run in O(K) instead of O(N), where K is the number of wanted elements and N is the size of the list to choose from:

public <T> List<T> take(List<T> source, int k) {
 int n = source.size();
 if (k > n) {
   throw new IllegalStateException(
     "Can not take " + k +
     " elements from a list with " + n +
     " elements");
 }
 List<T> result = new ArrayList<T>(k);
 Map<Integer,Integer> used = new HashMap<Integer,Integer>();
 int metric = 0;
 for (int i = 0; i < k; i++) {
   int off = random.nextInt(n - i);
   while (true) {
     metric++;
     Integer redirect = used.put(off, n - i - 1);
     if (redirect == null) {
       break;
     }
     off = redirect;
   }
   result.add(source.get(off));
 }
 assert metric <= 2*k;
 return result;
}

Upvotes: 0

Frank Schwieterman

Reputation: 24480

I think the selected answer is correct and pretty sweet. I implemented it differently though, as I also wanted the result in random order.

    static IEnumerable<SomeType> PickSomeInRandomOrder<SomeType>(
        IEnumerable<SomeType> someTypes,
        int maxCount)
    {
        Random random = new Random(DateTime.Now.Millisecond);

        Dictionary<double, SomeType> randomSortTable = new Dictionary<double,SomeType>();

        foreach(SomeType someType in someTypes)
            randomSortTable[random.NextDouble()] = someType;

        return randomSortTable.OrderBy(KVP => KVP.Key).Take(maxCount).Select(KVP => KVP.Value);
    }

Upvotes: 15

Tine M.

Reputation: 444

The simple solution I use (probably not good for large lists): copy the list into a temporary list, then in a loop randomly select an item from the temp list, put it in the selected items list, and remove it from the temp list (so it can't be reselected).

Example:

List<Object> temp = OriginalList.ToList();
List<Object> selectedItems = new List<Object>();
Random rnd = new Random();
Object o;
int i = 0;
while (i < NumberOfSelectedItems)
{
    o = temp[rnd.Next(temp.Count)];
    selectedItems.Add(o);
    temp.Remove(o);
    i++;
}

Upvotes: 3
