thetna

Reputation: 7143

ConcurrentHashMap: Can we trust it?

From the documentation of ConcurrentHashMap:

A hash table supporting full concurrency of retrievals and adjustable expected concurrency for updates.

Can we fully trust ConcurrentHashMap to perform thread-safe operations?

I am using a ConcurrentHashMap to map keys to their values. My key-value pair is:

Map<Integer, ArrayList<Double>> map1 = new ConcurrentHashMap<>();

Keys range from 0 to 1,000,000. I have 20 threads which can access/modify the value corresponding to a key at the same time. This is not frequent, but it is possible. I am getting an Infinity from the following code:

double sum = 0.0;
sum = sum + Math.exp(getScore(contextFeatureVector, entry.getValue()) + constant);

contextFeatureVector and entry.getValue() are ArrayLists associated with a key.

[EDIT]

constant = 0.0001

private double getScore(List<Double> featureVector, List<Double> weightVector) throws NullPointerException
{
    double score = 0.0;
    int length = featureVector.size();
    for (int i = 0; i < length; i++) {
        score = score + (featureVector.get(i) * weightVector.get(i));
    }
    return score;
}

Both featureVector and weightVector look like:

[-0.005554038592516575, 0.0048966974158881175, -0.05315976588195846, -0.030837804373964654, 0.014483064988148562, -0.018962129117649, -0.015221386014208877, 0.015825702365331477, -0.11363620479662287, 0.00802609847263844, -0.062106636476812194, 0.008108854471293185, -0.03193255218671684, 0.04949650992670292, -0.0545583154094599, -0.04873314092706468, 0.013534731656877033, 0.08433117163682455, 0.050310355477044114, -0.002420513353516017, -0.02708299928442614, -0.023489187394176294, -0.1277699782685597, -0.10071004855129333, 0.08649040730064464, -0.04940329664431305, -0.027481729446035053, -0.0571846057609884, -0.036738550618481455, -0.035608113682344365]

Thus the value returned from getScore does not get exceptionally large; it will be in the thousands.

Upvotes: 0

Views: 250

Answers (4)

Dawood ibn Kareem

Reputation: 79828

The largest number that can be stored in a Java double is approximately exp(709.78). So if you pass anything larger than about 709 into Math.exp(), you should expect the result to overflow to Infinity.
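A minimal sketch illustrating the limit (the exact threshold is Math.log(Double.MAX_VALUE), about 709.78):

```java
public class ExpOverflowDemo {
    public static void main(String[] args) {
        // Double.MAX_VALUE is about 1.8e308, i.e. roughly e^709.78
        System.out.println(Math.log(Double.MAX_VALUE)); // ~709.78
        System.out.println(Math.exp(709));              // large but finite
        System.out.println(Math.exp(710));              // Infinity
    }
}
```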

Upvotes: 1

Stephen C

Reputation: 718758

If you call Math.exp(...) on an input that is too large, you will get an Infinity. That is the probable cause of your problems ... not some imagined problem with thread safety.

I suggest that you add some trace code to see what

 getScore(contextFeatureVector, entry.getValue())

is returning when sum becomes an Infinity. Beyond that, I don't think we'll be able to help without seeing more of your code.
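A minimal trace sketch along these lines, using a hypothetical score of 2000 (the question says getScore returns values "in the thousands") to show how the Infinity appears:

```java
public class TraceDemo {
    public static void main(String[] args) {
        double constant = 0.0001;
        double score = 2000.0; // hypothetical getScore() result "in the thousands"
        double term = Math.exp(score + constant);
        if (Double.isInfinite(term)) {
            // any score above ~709.78 overflows Math.exp to Infinity
            System.err.println("exp overflow: score = " + score);
        }
        System.out.println(term); // Infinity
    }
}
```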

Upvotes: 2

Tomasz Nurkiewicz

Reputation: 340723

The data structure you use makes me believe there must be some bug in your code. Most likely you are fetching the list from the map and updating it:

map1.get(42).add(5);

Note that add(5) is not thread-safe, as it operates on an ordinary ArrayList. You either need a thread-safe list or the replace(K key, V oldValue, V newValue) method.

If you read carefully through the guarantees ConcurrentHashMap is giving, you can use it effectively.
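One way to make the per-key update atomic is ConcurrentHashMap.compute(), assuming Java 8 or later is available (a sketch, not necessarily the asker's setup):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ConcurrentHashMap;

public class SafeUpdateDemo {
    public static void main(String[] args) {
        ConcurrentHashMap<Integer, List<Double>> map1 = new ConcurrentHashMap<>();
        map1.put(42, new ArrayList<>());

        // compute() runs the remapping function atomically for key 42,
        // so two threads cannot interleave their add() calls on the list.
        map1.compute(42, (key, list) -> {
            list.add(5.0);
            return list;
        });

        System.out.println(map1.get(42)); // [5.0]
    }
}
```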

Upvotes: 3

Peter Lawrey

Reputation: 533492

It is thread-safe, but you can use it in a manner which is not thread-safe.

I suspect you haven't investigated the problem enough to determine that there is a bug in a JDK library which has been used for more than a decade.

Upvotes: 3
