quintin

Reputation: 837

How elements are grouped in a ConcurrentHashMap based on load factor

In some post I read:

ConcurrentHashMap groups elements by proximity based on the load factor

  1. How does this grouping happen?

  2. Let's say I override the hashCode() method so that it always returns 1. How will higher and lower values of the load factor affect inserts into a ConcurrentHashMap?

  3. Now I override the hashCode() method so that it always returns different hash codes. How will higher and lower values of the load factor affect inserts into a ConcurrentHashMap?

Upvotes: 1

Views: 195

Answers (1)

JimmyJames

Reputation: 1403

A hashmap is essentially an array of lists. For example, let's say a given hashmap has an array of 100 lists. When you add something to it, the hashCode is calculated for that object. Then the modulus of that value and the number of lists (in this case 100) is used to determine which list it is added to. So if you add an object with hashcode 13, it gets added to list 13. If you add an object with hashcode 12303512, it gets added to list 12.
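The bucket-selection idea above can be sketched as follows. (Note this models the answer's description; the real ConcurrentHashMap uses a power-of-two table and bitmasking rather than a literal modulus, but the effect on distribution is the same.)

```java
// Sketch of hash-based bucket selection: the list index is the
// hash code modulo the number of lists.
public class BucketDemo {
    static int bucketFor(int hashCode, int numLists) {
        // Math.floorMod keeps the index non-negative even for
        // negative hash codes.
        return Math.floorMod(hashCode, numLists);
    }

    public static void main(String[] args) {
        System.out.println(bucketFor(13, 100));       // 13
        System.out.println(bucketFor(12303512, 100)); // 12
    }
}
```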

The load factor tells the hashmap when to increase the number of lists. It's based on the number of items in the entire map and the current capacity.
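Concretely, the resize threshold is the capacity times the load factor, so a map with 100 lists and the default load factor of 0.75 grows once it holds more than 75 entries in total. A minimal sketch of that arithmetic:

```java
// Resize trigger: the map grows when the total entry count
// exceeds capacity * loadFactor, regardless of how entries are
// spread across the individual lists.
public class ThresholdDemo {
    public static void main(String[] args) {
        int capacity = 100;
        float loadFactor = 0.75f;
        int threshold = (int) (capacity * loadFactor);
        System.out.println("resize after " + threshold + " entries"); // 75
    }
}
```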

In your first scenario, where hashCode always returns 1, your objects will end up in the same list no matter how many lists there are (this is bad). In the second scenario, they will be distributed more evenly across the lists (this is good).
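The first scenario is easy to demonstrate with a hypothetical key class (`BadKey` below is made up for illustration) whose hashCode() always returns 1: every entry collides into one list, yet the map still behaves correctly, just with slower lookups.

```java
import java.util.concurrent.ConcurrentHashMap;

// Demo of the constant-hashCode scenario: all keys hash to 1, so
// every entry lands in the same list. Correctness is unaffected;
// only lookup performance degrades.
public class ConstantHashDemo {
    static final class BadKey {
        final String name;
        BadKey(String name) { this.name = name; }
        @Override public int hashCode() { return 1; } // every key collides
        @Override public boolean equals(Object o) {
            return o instanceof BadKey && ((BadKey) o).name.equals(name);
        }
    }

    public static void main(String[] args) {
        ConcurrentHashMap<BadKey, Integer> map = new ConcurrentHashMap<>();
        for (int i = 0; i < 1000; i++) {
            map.put(new BadKey("k" + i), i);
        }
        System.out.println(map.get(new BadKey("k500"))); // 500
        System.out.println(map.size());                  // 1000
    }
}
```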

Since the load factor is based on the overall size of the map and not on the size of the individual lists, the quality of your hash codes doesn't really interact with the load factor. In the first scenario, the map will grow just as it does in the second one, but everything will still end up in the same list regardless.

Upvotes: 0
