Lemur

Reputation: 2665

rand() still gives the same value

I noticed this while practicing by writing a simple console-based quiz app. When I use rand(), it gives me the same value several times in a row. The smaller the number range, the bigger the problem.

For example

for (i=0; i<10; i++) {
    x = rand() % 20 + 1;
    cout << x << ", ";
}

Will give me 1, 1, 1, 2, 1, 1, 1, 1, 14, ... and there are definitely too many ones in there, right? I usually get anywhere from none to 4 odd ones out (the rest are all the same; it can also be 11, 11, 11, 4, 11, ...).

Am I doing something wrong? Or is rand() not as random as I thought it was?
(Or is it just some habit from C#/Java that I'm not aware of? That happens to me a lot, too...)

Upvotes: 3

Views: 7294

Answers (4)

Jeff Wolski

Reputation: 6372

The likely problems are that you are using the same "random" numbers each time and that any int mod 1 is zero. In other words (myInt % 1 == 0) is always true. Instead of %1, use % theBiggestNumberDesired.

Also, seed your random numbers with srand(). Use a constant seed to verify that you are getting good results. Then change the seed to make sure you are still getting good results. Then use a more random seed, like the clock, to test further. Release with the random seed.
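A minimal sketch of that workflow, assuming a made-up constant TEST_SEED and a hand-flipped releaseBuild switch (neither is from the answer itself), might look like this:

#include <cstdlib>
#include <ctime>
#include <iostream>

int main()
{
    const unsigned TEST_SEED = 42;   // hypothetical constant seed for testing
    bool releaseBuild = false;       // assumption: flip to true when releasing

    if (releaseBuild)
        srand(static_cast<unsigned>(time(0)));   // "more random" seed: the clock
    else
        srand(TEST_SEED);                        // reproducible runs while testing

    for (int i = 0; i < 10; i++)
        std::cout << rand() % 20 + 1 << ", ";
    std::cout << '\n';
    return 0;
}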

Upvotes: 0

Tomas Aschan

Reputation: 60584

If I run that code a couple of times, I get different output. Sure, not as varied as I'd like, but seemingly not deterministic (although of course it is, since rand() only gives pseudo-random numbers...).

However, the way you treat your numbers isn't going to give you a uniform distribution over [1,20], which I guess is what you expect. To achieve that is rather more complicated, but in no way impossible. For an example, take a look at the documentation for <random> at cplusplus.com - at the bottom there's a showcase program that generates a uniform distribution over [0,1). To get that to [1,20), you simply change the input parameters to the generator - it can give you a uniform distribution over any range you like.
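For what it's worth, a sketch of that approach with <random> (C++11), adapted here to the integer range 1 to 20 rather than copied from the showcase program, could look like this:

#include <iostream>
#include <random>

int main()
{
    std::random_device rd;                           // seed source
    std::mt19937 gen(rd());                          // Mersenne Twister engine
    std::uniform_int_distribution<int> dist(1, 20);  // uniform over 1..20 inclusive

    for (int i = 0; i < 10; i++)
        std::cout << dist(gen) << ", ";
    std::cout << '\n';
    return 0;
}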

I did a quick test, and called rand() one million times. As you can see in the output below, even at very large sample sizes, there are some nonuniformities in the distribution. As the number of samples goes to infinity, the line will (probably) flatten out, but using something like rand() % 20 + 1 gives you a distribution that takes a very long time to do so. If you take something else (like the example above), your chances of achieving a uniform distribution are better even for quite small sample sizes.

[Plot: 1 million calls to rand()]

Edit:
I see several others posting about using srand() to seed the random number generator before using it. This is good advice, but it won't solve your problem in this case. I repeat: seeding is not the problem in this case.

Seeds are mainly used to control the reproducibility of the output of your program. If you seed your random number with a constant value (e.g. 0), the program will give the same output every time, which is useful for testing that everything works the way it should. By seeding with something non-constant (the current time is a popular choice) you ensure that the results vary between different runs of the program.

Not calling srand() at all is the same as calling srand(1), by the C++ standard. Thus, you'll get the same results every time you run the program, but you'll have a perfectly valid series of pseudo-random numbers within each run.
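A tiny sketch of my own to illustrate the point: reseeding with 1 replays exactly the same sequence, which is also what you get on every run when srand() is never called at all.

#include <cstdlib>
#include <iostream>

int main()
{
    srand(1);                                 // explicit srand(1) == no srand() at all
    for (int i = 0; i < 5; i++)
        std::cout << rand() % 20 + 1 << ", ";
    std::cout << '\n';

    srand(1);                                 // same seed again...
    for (int i = 0; i < 5; i++)
        std::cout << rand() % 20 + 1 << ", "; // ...so the same five numbers come out
    std::cout << '\n';
    return 0;
}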

Upvotes: 4

Johnny Mnemonic

Reputation: 3912

You need to call srand() first and pass it the current time as the parameter to get better pseudorandom values.

Example:

#include <iostream>
#include <cstdlib>
#include <ctime>

using namespace std;

int main()
{
    srand(time(0));       // seed with the current time so each run differs
    int x, i;
    for (i = 0; i < 10; i++) {
        x = rand() % 20 + 1;
        cout << x << ", ";
    }
    system("pause");      // Windows-only; keeps the console window open
    return 0;
}

If you don't want any of the generated numbers to repeat and memory isn't a concern, you can fill a vector with ints, shuffle it randomly, and then take the first N values.

Example:

#include <iostream>
#include <vector>
#include <algorithm>
#include <cstdlib>

using namespace std;

int main()
{
    // Get 5 distinct random numbers between 1 and 20
    vector<int> v;
    for (int i = 1; i <= 20; i++)
        v.push_back(i);                   // fill with 1..20
    random_shuffle(v.begin(), v.end());   // shuffle into a random order
    for (int i = 0; i < 5; i++)
        cout << v[i] << endl;             // the first 5 entries are the picks
    system("pause");
    return 0;
}
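If C++11 is available, a variant of the same idea (my sketch, not part of the original answer) seeds an engine explicitly and passes it to std::shuffle, so the shuffle no longer depends on rand() or srand() at all:

#include <iostream>
#include <vector>
#include <algorithm>
#include <random>

using namespace std;

int main()
{
    vector<int> v;
    for (int i = 1; i <= 20; i++)
        v.push_back(i);                 // fill with 1..20

    random_device rd;
    mt19937 gen(rd());                  // explicitly seeded engine
    shuffle(v.begin(), v.end(), gen);   // engine-driven shuffle

    for (int i = 0; i < 5; i++)
        cout << v[i] << endl;           // first 5 entries: distinct values from 1..20
    return 0;
}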

Upvotes: 2

Jack Aidley

Reputation: 20107

Sounds like you're hitting modulo bias.

Scaling your random numbers to a range by using % is not a good idea. It's just about passable if you're reducing it to a range that is a power of 2, but still pretty poor. It is primarily influenced by the smaller bits, which are frequently less random with many algorithms (and rand() in particular), and it contracts to the smaller range in a non-uniform fashion because the range you're reducing to will not equally divide the range of your random number generator. To reduce the range you should be using a division and a loop, like so:

// generate a number from 0 to range-1
int divisor = RAND_MAX / (range + 1);
int result;
do
{
    result = rand() / divisor;
} while (result >= range);   // reject the few out-of-range results

This is not as inefficient as it looks, because the loop is nearly always passed through only once. Also, if you're ever going to use your generator for numbers that approach RAND_MAX, you'll need a more complex equation for the divisor, which I can't remember off-hand.
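Applied to the question's range of 1 to 20, the loop above could be wrapped up roughly like this (rand_range is a made-up helper name, not something from the standard library):

#include <cstdlib>
#include <ctime>
#include <iostream>

// returns a number from 0 to range-1 using the rejection loop above
int rand_range(int range)
{
    int divisor = RAND_MAX / (range + 1);
    int result;
    do {
        result = rand() / divisor;
    } while (result >= range);
    return result;
}

int main()
{
    srand(static_cast<unsigned>(time(0)));
    for (int i = 0; i < 10; i++)
        std::cout << rand_range(20) + 1 << ", ";   // 1..20 without modulo bias
    std::cout << '\n';
    return 0;
}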

Also, rand() is a very poor random number generator; consider using something like a Mersenne Twister if you care about the quality of your results.

Upvotes: 3
