Reputation: 19905
I have a Java application that uses the compareTo() method of the BigDecimal class to classify a (big) real number, read as a string, according to its type (basically, "too big", double or float). The application reads a very large number of such strings per second, so any performance optimization is essential.
Following is an abbreviated excerpt of the code:
static final BigDecimal MAX_LONG = new BigDecimal(Long.MAX_VALUE);
static final BigDecimal MAX_FLOAT = new BigDecimal(Float.MAX_VALUE);
static final BigDecimal MAX_DOUBLE = new BigDecimal(Double.MAX_VALUE);

String value = readValue(); // Read the number as a string
BigDecimal number = new BigDecimal(value);

if (number.compareTo(MAX_DOUBLE) > 0)
{
    ...
}
else if (number.compareTo(MAX_FLOAT) > 0)
{
    ...
}
else if (number.compareTo(MAX_LONG) > 0)
{
    ...
}
So, 2 questions
Upvotes: 2
Views: 1430
Reputation: 46960
I agree with those who questioned whether comparison really is a bottleneck. File or network IO time is more likely.
If comparisons really are a bottleneck, and you make an i.i.d. (independent, identically distributed) or similar assumption about the data, then you'll need fewer comparisons on average if you keep a histogram counting how many inputs fall in each interval and reorder the tests on the fly so that the most frequent case is tested first.
For example, your current ladder of comparisons is best if most numbers are greater than MAX_DOUBLE: only one comparison per number is needed. It is worst if most numbers are less than or equal to MAX_FLOAT, since then three comparisons per number are needed.
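A minimal sketch of this idea, under some assumptions not in the answer: each bucket is defined by an explicit (lower, upper] interval rather than a one-sided test (so the buckets can be tested in any order without changing the result), and the reordering is a simple "bubble forward" of a bucket whose count overtakes its predecessor's. The class name and bucket indexing are hypothetical.

```java
import java.math.BigDecimal;

public class AdaptiveClassifier {
    static final BigDecimal MAX_LONG = new BigDecimal(Long.MAX_VALUE);
    static final BigDecimal MAX_FLOAT = new BigDecimal(Float.MAX_VALUE);
    static final BigDecimal MAX_DOUBLE = new BigDecimal(Double.MAX_VALUE);

    // Bucket i matches numbers in (LOWER[i], UPPER[i]]; null means unbounded.
    // 0 = "too big", 1 = double range, 2 = float range, 3 = MAX_LONG or below.
    private static final BigDecimal[] LOWER = { MAX_DOUBLE, MAX_FLOAT, MAX_LONG, null };
    private static final BigDecimal[] UPPER = { null, MAX_DOUBLE, MAX_FLOAT, MAX_LONG };

    private final long[] counts = new long[4]; // histogram of matches per bucket
    private final int[] order = { 0, 1, 2, 3 }; // current test order

    /** Returns the bucket index of the number, testing frequent buckets first. */
    int classify(BigDecimal number) {
        for (int pos = 0; pos < order.length; pos++) {
            int b = order[pos];
            boolean aboveLower = LOWER[b] == null || number.compareTo(LOWER[b]) > 0;
            boolean atMostUpper = UPPER[b] == null || number.compareTo(UPPER[b]) <= 0;
            if (aboveLower && atMostUpper) {
                counts[b]++;
                // Move this bucket one step earlier in the test order if it
                // has become more frequent than the bucket tested before it.
                if (pos > 0 && counts[b] > counts[order[pos - 1]]) {
                    order[pos] = order[pos - 1];
                    order[pos - 1] = b;
                }
                return b;
            }
        }
        throw new AssertionError("buckets cover all values");
    }
}
```

Note the trade-off: interval tests cost up to two compareTo() calls each, so this only pays off when the input distribution is skewed enough that the frequent bucket is usually tested first.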
Upvotes: 1
Reputation: 310915
As BigDecimal is immutable, it is also thread-safe.

You should also use BigDecimal.valueOf() instead of new BigDecimal() throughout, to take advantage of any caching that may be possible.
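A sketch of what that substitution looks like for the constants above, with one caveat worth knowing (not mentioned in the answer): BigDecimal.valueOf(double) goes through Double.toString, which produces the shortest decimal string that round-trips back to the same double, whereas new BigDecimal(double) expands the exact binary value. The two can therefore compare unequal, which matters if the constant is used as a comparison boundary.

```java
import java.math.BigDecimal;

public class Constants {
    // valueOf(long) may return a cached instance for small values.
    static final BigDecimal MAX_LONG = BigDecimal.valueOf(Long.MAX_VALUE);

    // valueOf(double) parses Double.toString(d): the shortest decimal that
    // round-trips, not the exact binary expansion of the double.
    static final BigDecimal MAX_DOUBLE = BigDecimal.valueOf(Double.MAX_VALUE);

    // new BigDecimal(double) gives the exact binary value, which is a
    // slightly different (here, larger) number than valueOf produces.
    static final BigDecimal MAX_DOUBLE_EXACT = new BigDecimal(Double.MAX_VALUE);
}
```

For long-valued constants the two forms are numerically identical, so valueOf is a safe drop-in there; for the double-valued boundaries, pick one form and use it consistently.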
Upvotes: 6