Jan Ajan

Reputation: 1591

Measure double vs BigDecimal

I wrote a simple benchmark that tests the performance of multiplying doubles vs. BigDecimal. Is my method correct? I use randomized values because the compiler optimizes away multiplication of constants (e.g. Math.PI * Math.E) by folding them at compile time.
But:
- I don't know whether generating random numbers inside the timed loop distorts the result.
- The same goes for creating new BigDecimal objects inside the timed loop.

I want to test the performance of the multiplication only (not the time spent constructing objects).

How can it be done?

import java.math.*;
import java.util.*;

public class DoubleVsBigDecimal
{
    public static void main(String[] args)
    {
        Random rnd = new Random();
        long t1, t2, t3;
        double t;

        t1 = System.nanoTime();

        for(int i=0; i<1000000; i++)
        {
            double d1 = rnd.nextDouble();
            double d2 = rnd.nextDouble();
            t = d1 * d2;
        }

        t2 = System.nanoTime();

        for(int i=0; i<1000000; i++)
        {
            BigDecimal bd1 = BigDecimal.valueOf(rnd.nextDouble());
            BigDecimal bd2 = BigDecimal.valueOf(rnd.nextDouble());
            bd1.multiply(bd2);
        }

        t3 = System.nanoTime();

        System.out.println(String.format("%f",(t2-t1)/1e9));
        System.out.println(String.format("%f",(t3-t2)/1e9));
        System.out.println(String.format("%f",(double)(t3-t2)/(double)(t2-t1)));
    }
}
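One way to keep both Random and the BigDecimal constructors out of the timed region is to generate all operands first and fold each product into a sink variable so the JIT cannot remove the loops as dead code. A sketch of that idea (the class name, fixed seed, and sink choices are mine, for illustration):

```java
import java.math.BigDecimal;
import java.util.Random;

public class MultiplyOnlyBenchmark {
    static final int N = 1000000;

    // Returns {doubleNanos, bigDecimalNanos}; all operands are built before timing starts.
    static long[] run() {
        Random rnd = new Random(42); // fixed seed, so repeated runs use the same operands
        double[] da = new double[N], db = new double[N];
        BigDecimal[] ba = new BigDecimal[N], bb = new BigDecimal[N];
        for (int i = 0; i < N; i++) {
            da[i] = rnd.nextDouble();
            db[i] = rnd.nextDouble();
            ba[i] = BigDecimal.valueOf(da[i]);
            bb[i] = BigDecimal.valueOf(db[i]);
        }

        // Accumulate every product into a sink so the JIT cannot treat the loops as dead code.
        double dSink = 0;
        long t1 = System.nanoTime();
        for (int i = 0; i < N; i++) dSink += da[i] * db[i];
        long t2 = System.nanoTime();

        int bSink = 0;
        for (int i = 0; i < N; i++) bSink += ba[i].multiply(bb[i]).signum();
        long t3 = System.nanoTime();

        if (dSink < 0 || bSink < 0) throw new AssertionError(); // impossible; reads the sinks
        return new long[] { t2 - t1, t3 - t2 };
    }

    public static void main(String[] args) {
        long[] ns = run();
        System.out.printf("double:     %f s%n", ns[0] / 1e9);
        System.out.printf("BigDecimal: %f s%n", ns[1] / 1e9);
        System.out.printf("ratio:      %f%n", (double) ns[1] / ns[0]);
    }
}
```

Note that this still runs both loops on a cold JVM, so the first loop pays the warm-up cost; the answers below address that separately.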

Upvotes: 2

Views: 690

Answers (3)

Robert

Reputation: 8609

You are not only timing the multiply operation, you are also timing other things.

You need to do something like:

    long time = 0;
    double t;
    for (int i = 0; i < 1000000; i++) {
        double d1 = rnd.nextDouble();
        double d2 = rnd.nextDouble();
        long start = System.nanoTime();
        t = d1 * d2;
        long end = System.nanoTime();
        time += (end - start);
    }
    long meantime = time / 1000000;

then probably calculate the standard error too. You will also need to warm up the JVM with some calculations before you start timing; otherwise you will get some high values at the start, while the code is still being interpreted and compiled.
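The per-iteration samples can then be summarized with a mean and the standard error of the mean. A minimal sketch (class and method names are mine, for illustration):

```java
public class TimingStats {
    // Arithmetic mean of a set of per-iteration timings, in nanoseconds.
    static double mean(long[] samples) {
        double sum = 0;
        for (long s : samples) sum += s;
        return sum / samples.length;
    }

    // Standard error of the mean: sqrt(sampleVariance / n).
    static double standardError(long[] samples) {
        double m = mean(samples);
        double sq = 0;
        for (long s : samples) sq += (s - m) * (s - m);
        double variance = sq / (samples.length - 1); // sample (n-1) variance
        return Math.sqrt(variance / samples.length);
    }

    public static void main(String[] args) {
        long[] demo = { 100, 110, 90, 105, 95 };
        System.out.printf("mean=%.1f ns, stderr=%.2f ns%n", mean(demo), standardError(demo));
    }
}
```

A small standard error relative to the mean tells you the timing has stabilized; a large one suggests warm-up effects or interference are still dominating.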

Upvotes: 3

user unknown

Reputation: 36260

You can generate two collections of 1000 doubles/BigDecimals up front, and multiply each with each in two nested loops:

public static void main(String[] args)
{
    Random rnd = new Random();
    List <Double> dl = new ArrayList <Double> ();
    List <BigDecimal> bdl = new ArrayList <BigDecimal> ();

    for(int i=0; i<1000; i++)
    {
        double d = rnd.nextDouble();
        dl.add (d);
        bdl.add (new BigDecimal (d));
    }

    long t1 = System.nanoTime();
    double t;
    for (double d1 : dl)
        for (double d2 : dl)
            t = d1 * d2;

    long t2 = System.nanoTime();

    for (BigDecimal b1 : bdl)
        for (BigDecimal b2 : bdl)
            b1.multiply (b2);

    long t3 = System.nanoTime();

    System.out.println (String.format ("%f", (t2 - t1) / 1e9));
    System.out.println (String.format ("%f", (t3 - t2) / 1e9));
    System.out.println (String.format ("%f", (double) (t3 - t2) / (double) (t2 - t1)));
} 

Your original code produced pretty stable values like these when repeated:

0,186755
10,970243
58,741445

And my code produced these different values, also stable:

0,077177
1,112490
14,414710

The ratio between the two runs differs by about 1:4, which is quite a lot, though not as large as the BigDecimal:double ratio itself.

(i386-32, client mode, JRE 1.6, Linux, Oracle, 2 GHz Centrino single core).

Upvotes: 1

Louis Wasserman

Reputation: 198371

Benchmarking in Java is really weird. For example, the JVM won't actually fully optimize a piece of code until it has already been run many times -- but it's fairer to do the measurement after that optimization, because any production system will be calling this method many times.

There are a bunch of other gotchas with Java benchmarking. Probably the simplest way to avoid them is to use a Java benchmarking tool built by experts, e.g. Caliper.
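To see the warm-up effect without any external tooling, you can time the same batch of work repeatedly; the early batches typically include interpretation and JIT compilation, while later ones reflect steady state. A dependency-free illustration (batch size and seed are arbitrary choices, and this is not a substitute for a proper harness):

```java
import java.math.BigDecimal;
import java.util.Random;

public class WarmupDemo {
    // Times one batch of BigDecimal multiplications and returns elapsed nanoseconds.
    static long batch(BigDecimal[] a, BigDecimal[] b) {
        int sink = 0;
        long start = System.nanoTime();
        for (int i = 0; i < a.length; i++) sink += a[i].multiply(b[i]).signum();
        long elapsed = System.nanoTime() - start;
        if (sink == Integer.MIN_VALUE) throw new AssertionError(); // keeps the sink live
        return elapsed;
    }

    public static void main(String[] args) {
        Random rnd = new Random(1);
        int n = 100000;
        BigDecimal[] a = new BigDecimal[n], b = new BigDecimal[n];
        for (int i = 0; i < n; i++) {
            a[i] = BigDecimal.valueOf(rnd.nextDouble());
            b[i] = BigDecimal.valueOf(rnd.nextDouble());
        }
        // Early batches are usually noticeably slower than later, compiled ones.
        for (int run = 1; run <= 10; run++)
            System.out.printf("batch %2d: %8.3f ms%n", run, batch(a, b) / 1e6);
    }
}
```

If you only keep a single un-warmed measurement, you are mostly measuring the JIT, not the multiplication.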

Upvotes: 3
