ash-rocks

Reputation: 41

ParallelStream with Maps

I want to read values from a complex map and execute a function on them in parallel. For that I am using ForkJoinPool from Java 8.

The issue I am facing is that, for a few values, the function is executed twice. Initially I thought HashMap is not thread-safe, so I tried using Hashtable, but it's the same...

final Map<CakeOrder, List<CakeOrderDetails>> cakeOrderMap = cakeUpdatesQueueItemDao.getCakeOrdersByStatus(EItemStatus.OPENED, country.getDbTableSuffix(), true);
final Map<CakeOrder, List<CakeOrderDetails>> cakeOrderTableMap = new Hashtable<>();

cakeOrderMap.forEach((k, v) -> cakeOrderTableMap.put(k, v));

final ForkJoinPool pool = new ForkJoinPool(20);
pool.submit(() -> cakeOrderTableMap.entrySet()
   .stream()
   .parallel()
   .forEach(entry -> cakeService.updateCakeOrderStatusAndPrice(entry.getKey(), entry.getValue()))).invoke();

Upvotes: 3

Views: 1132

Answers (1)

Tagir Valeev

Reputation: 100169

For a parallel stream, thread-safety of the input collection is unnecessary as long as it is not updated during the stream operation. On the other hand, Hashtable.entrySet() is a very bad source for a parallel stream, as Hashtable has no custom spliterator implementation; using HashMap is much better. Nevertheless, with the code provided, everything should work correctly with either HashMap or Hashtable (though much less efficiently with Hashtable). I suspect the problem is inside cakeService.updateCakeOrderStatusAndPrice, which is not shown here and is likely not thread-safe.
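To illustrate this point, here is a minimal sketch (hypothetical data, not the asker's cakeService) showing that a plain, non-thread-safe HashMap is a perfectly fine source for a parallel stream run inside a custom ForkJoinPool: each entry is visited exactly once, provided the consumer itself is thread-safe. A ConcurrentHashMap of counters stands in for the update method so we can verify the invocation count per key.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.atomic.AtomicInteger;

public class ParallelMapDemo {

    // Returns the number of keys that were processed a number of times != 1.
    static long runDemo() throws Exception {
        // Plain HashMap as the stream source: fine, as long as nobody
        // modifies it while the stream is running.
        Map<Integer, String> orders = new HashMap<>();
        for (int i = 0; i < 1000; i++) {
            orders.put(i, "order-" + i);
        }

        // Thread-safe per-key counters standing in for the real update logic.
        ConcurrentHashMap<Integer, AtomicInteger> calls = new ConcurrentHashMap<>();

        ForkJoinPool pool = new ForkJoinPool(20);
        try {
            pool.submit(() -> orders.entrySet()
                    .parallelStream()
                    .forEach(e -> calls
                            .computeIfAbsent(e.getKey(), k -> new AtomicInteger())
                            .incrementAndGet()))
                .get(); // wait for completion of the submitted task
        } finally {
            pool.shutdown();
        }

        // Count keys whose consumer ran more or fewer times than once.
        return calls.values().stream()
                .filter(c -> c.get() != 1)
                .count();
    }

    public static void main(String[] args) throws Exception {
        System.out.println("keys not processed exactly once: " + runDemo());
    }
}
```

With a thread-safe consumer, the count is 0 on every run; if the consumer mutates shared, unsynchronized state (as the hidden updateCakeOrderStatusAndPrice may do), effects can appear lost or duplicated even though the stream itself visits each entry once.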

Upvotes: 3
