Timotheus

Reputation: 2340

HashMap vs Array performance

Is it better, performance-wise, to use arrays or HashMaps when the indexes of the array are known? Keep in mind that the 'objects' array/map in the example is just an example; in my real project it is generated by another class, so I can't use individual variables.

Array example:

SomeObject[] objects = new SomeObject[2];
objects[0] = new SomeObject("Obj1");
objects[1] = new SomeObject("Obj2");

void doSomethingToObject(String Identifier){
    SomeObject object;
    if(Identifier.equals("Obj1")){
        object=objects[0];
    }else if(Identifier.equals("Obj2")){
        object=objects[1];
    }
    //do stuff
}

HashMap example:

HashMap objects = new HashMap();
objects.put("Obj1",new SomeObject());
objects.put("Obj2",new SomeObject());

void doSomethingToObject(String Identifier){
    SomeObject object = (SomeObject) objects.get(Identifier);
    //do stuff
}

The HashMap one looks much, much better, but I really need performance here, so that has priority.

EDIT: Well, arrays it is then; suggestions are still welcome.

EDIT: I forgot to mention that the size of the array/HashMap is always the same (6).

EDIT: It appears that the HashMap is faster. Array: 128 ms, Hash: 103 ms.

With fewer iterations, the HashMap was even twice as fast.

Test code:

import java.util.HashMap;
import java.util.Random;

public class OptimizationTest {
private static Random r = new Random();

private static HashMap<String,SomeObject> hm = new HashMap<String,SomeObject>();
private static SomeObject[] o = new SomeObject[6];

private static String[] Identifiers = {"Obj1","Obj2","Obj3","Obj4","Obj5","Obj6"};

private static int t = 1000000;

public static void main(String[] args){
    CreateHash();
    CreateArray();
    long loopTime = ProcessArray();
    long hashTime = ProcessHash();
    System.out.println("Array: " + loopTime + "ms");
    System.out.println("Hash: " + hashTime + "ms");
}

public static void CreateHash(){
    for(int i=0; i <= 5; i++){
        hm.put("Obj"+(i+1), new SomeObject());
    }
}

public static void CreateArray(){
    for(int i=0; i <= 5; i++){
        o[i]=new SomeObject();
    }
}

public static long ProcessArray(){
    StopWatch sw = new StopWatch(); // simple elapsed-time helper (not a JDK class; not shown here)
    sw.start();
    for(int i = 1;i<=t;i++){
        checkArray(Identifiers[r.nextInt(6)]);
    }
    sw.stop();
    return sw.getElapsedTime();
}



private static void checkArray(String Identifier) {
    SomeObject object;
    if(Identifier.equals("Obj1")){
        object=o[0];
    }else if(Identifier.equals("Obj2")){
        object=o[1];
    }else if(Identifier.equals("Obj3")){
        object=o[2];
    }else if(Identifier.equals("Obj4")){
        object=o[3];
    }else if(Identifier.equals("Obj5")){
        object=o[4];
    }else if(Identifier.equals("Obj6")){
        object=o[5];
    }else{
        object = new SomeObject();
    }
    object.kill();
}

public static long ProcessHash(){
    StopWatch sw = new StopWatch();
    sw.start();
    for(int i = 1;i<=t;i++){
        checkHash(Identifiers[r.nextInt(6)]);
    }
    sw.stop();
    return sw.getElapsedTime();
}

private static void checkHash(String Identifier) {
    SomeObject object = (SomeObject) hm.get(Identifier);
    object.kill();
}

}

Upvotes: 37

Views: 89731

Answers (7)

user22507980

Reputation: 1

I'm surprised no one has mentioned the cost inside the HashMap itself. Don't forget that a hash function is called behind the scenes every time you put data into, or get data from, a HashMap. Each function call uses the stack, and a hash function is not a single primitive operation; the smarter the hash function, the more complex the code it runs. Likewise, the tree structure a HashMap uses behind the scenes for crowded buckets takes time and code to insert data and rebalance. Note that no one has answered the question of when a hash table becomes more efficient than a simple array.
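
A minimal sketch of that point, using a hypothetical key class that counts how often its hashCode() is invoked (the class and counter are purely illustrative):

import java.util.HashMap;
import java.util.Map;

public class HashCostDemo {
    // Hypothetical key class: counts hashCode() calls to show that
    // every put() and get() runs the hash function.
    static final class CountingKey {
        static int hashCalls = 0;
        private final String value;

        CountingKey(String value) { this.value = value; }

        @Override
        public int hashCode() {
            hashCalls++;
            return value.hashCode();
        }

        @Override
        public boolean equals(Object o) {
            return o instanceof CountingKey && value.equals(((CountingKey) o).value);
        }
    }

    public static void main(String[] args) {
        Map<CountingKey, String> map = new HashMap<CountingKey, String>();
        CountingKey key = new CountingKey("Obj1");
        map.put(key, "value");   // one hashCode() call
        map.get(key);            // another hashCode() call
        System.out.println("hashCode() invocations: " + CountingKey.hashCalls); // prints 2
    }
}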

Upvotes: 0

Peter Lawrey

Reputation: 533880

HashMap uses an array underneath so it can never be faster than using an array correctly.

Random.nextInt() is many times slower than what you are testing; even using it to drive the array test is going to bias your results.

The reason your array benchmark is so slow is due to the equals comparisons, not the array access itself.

Hashtable is usually much slower than HashMap because it does much the same thing but is also synchronized.

A common problem with micro-benchmarks is the JIT, which is very good at removing code which doesn't do anything. If you are not careful, you will only be testing whether you have confused the JIT enough that it cannot work out that your code doesn't do anything.

This is one of the reasons you can write micro-benchmarks which outperform C++ systems: Java is a simpler language and easier to reason about, and thus it is easier to detect code which does nothing useful. This can lead to tests which show that Java does "nothing useful" much faster than C++ ;)
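
A rough sketch of how the test in the question could reduce both problems: draw the random identifiers before starting the clock, and accumulate something from every lookup (and print it) so the JIT cannot discard the loop as dead code. The names below are illustrative, and SomeObject is replaced by a plain String so the sketch stays self-contained:

import java.util.HashMap;
import java.util.Map;
import java.util.Random;

public class LookupBenchSketch {
    public static void main(String[] args) {
        String[] identifiers = {"Obj1", "Obj2", "Obj3", "Obj4", "Obj5", "Obj6"};
        Map<String, String> map = new HashMap<String, String>();
        for (String id : identifiers) {
            map.put(id, id.toUpperCase());
        }

        int iterations = 1_000_000;

        // Precompute the random key sequence so Random.nextInt() is not timed.
        Random random = new Random(42);
        String[] keys = new String[iterations];
        for (int i = 0; i < iterations; i++) {
            keys[i] = identifiers[random.nextInt(identifiers.length)];
        }

        // Accumulate a value that depends on every lookup and print it,
        // so the JIT cannot eliminate the loop as dead code.
        long start = System.nanoTime();
        int sink = 0;
        for (String key : keys) {
            sink += map.get(key).length();
        }
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;

        System.out.println("HashMap lookups: " + elapsedMs + " ms (sink=" + sink + ")");
    }
}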

Upvotes: 46

ratchet freak

Reputation: 48226

Arrays are faster when the indexes are known (HashMap uses an array of linked lists behind the scenes, which adds a bit of overhead on top of the array accesses, not to mention the hashing operations that need to be done).

And FYI, HashMap<String,SomeObject> objects = new HashMap<String,SomeObject>(); means you won't have to cast.
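
A small self-contained sketch of that point (a trivial placeholder class stands in for the question's SomeObject):

import java.util.HashMap;

public class NoCastExample {
    // Placeholder standing in for the question's SomeObject.
    static class SomeObject {}

    public static void main(String[] args) {
        HashMap<String, SomeObject> objects = new HashMap<String, SomeObject>();
        objects.put("Obj1", new SomeObject());
        objects.put("Obj2", new SomeObject());

        // With the type parameters declared, no (SomeObject) cast is needed:
        SomeObject object = objects.get("Obj1");
        System.out.println(object != null); // prints true
    }
}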

Upvotes: 5

BlueGuitar

Reputation: 41

The example is strange. The key question is whether your data is dynamic. If it is, you could not write your program that way (as in the array case). In other words, comparing your array and hash implementations is not fair: the hash implementation works for dynamic data, but the array implementation does not.

If you only have static data (6 fixed objects), the array or hash just works as a data holder. You could even define static objects.

Upvotes: 0

Alex Gitelman

Reputation: 24732

Logically, HashMap is definitely a fit in your case. From a performance standpoint it also wins, since with arrays you would need to do a number of string comparisons (in your algorithm), while with a HashMap you just use the hash code if the load factor is not too high. Both an array and a HashMap will need to be resized if you add many elements, but in the case of a HashMap you will also need to redistribute the elements; in that case the HashMap loses.
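
On the resizing point: if the number of entries is roughly known up front, the redistribution cost can be avoided by presizing the HashMap. A minimal sketch, where the 100,000-entry count is just an illustrative assumption:

import java.util.HashMap;
import java.util.Map;

public class PresizeSketch {
    public static void main(String[] args) {
        int expectedEntries = 100_000; // illustrative element count

        // Default-sized map: grows and rehashes several times while being filled.
        Map<Integer, String> growing = new HashMap<Integer, String>();

        // Presized map: capacity chosen so expectedEntries stay below the
        // default 0.75 load factor, so no rehashing happens during the fill.
        Map<Integer, String> presized = new HashMap<Integer, String>((int) (expectedEntries / 0.75f) + 1);

        for (int i = 0; i < expectedEntries; i++) {
            growing.put(i, "v" + i);
            presized.put(i, "v" + i);
        }
    }
}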

Upvotes: 2

Basanth Roy

Reputation: 6500

Arrays will usually be faster than Collections classes.

PS. You mentioned Hashtable in your post. Hashtable has even worse performance than HashMap. I assume your mention of Hashtable was a typo:

"The HashTable one looks much much better "

Upvotes: 0

sehe

Reputation: 393924

For the example shown, the hash table wins, I believe. The problem with the array approach is that it doesn't scale: I imagine you want to have more than two entries in the table, and the conditional branch tree in doSomethingToObject will quickly get unwieldy and slow.

Upvotes: 2
