Reputation: 45
I want to make a little production calculator. I have a Part A which takes 7 s and is made out of x Part B (which takes 3 s) and y Part C (which takes 2 s). Part C is made out of n Part D. So I have a recursion in the class Part, with its usage count as the "recipe". My entity class:
@Entity
@Data
public class ProductEntity {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @Column(unique = true)
    private String name;

    private double productionCycleTimeInSeconds;

    private int batchPerProductionCycle;

    @Enumerated(EnumType.ORDINAL)
    private FacilityEnum facility;

    @ElementCollection
    @JoinTable(name = "end_product_recipe", joinColumns = @JoinColumn(name = "end_product_id"))
    @MapKeyColumn(name = "resourceId")
    @Column(name = "usage_count")
    private Map<ProductEntity, Integer> recipe;
}
For testing I wanted to initialize the table with 3 basic products:
ProductEntity ironOre = new ProductEntity();
ironOre.setFacility(FacilityEnum.MINING_MACHINE);
ironOre.setName("Iron Ore");
ironOre.setProductionCycleTimeInSeconds(1.0);
ironOre.setBatchPerProductionCycle(1);
productRepository.save(ironOre);

ProductEntity ironIngot = new ProductEntity();
ironIngot.setFacility(FacilityEnum.SMELTER);
ironIngot.setName("Iron Ingot");
ironIngot.setProductionCycleTimeInSeconds(1.0);
ironIngot.setBatchPerProductionCycle(1);
Map<ProductEntity, Integer> recipe = new HashMap<>();
recipe.put(ironOre, 1);
ironIngot.setRecipe(recipe);
productRepository.save(ironIngot);

ProductEntity gear = new ProductEntity();
gear.setFacility(FacilityEnum.ASSEMBLING_MACHINE);
gear.setName("Gear");
gear.setProductionCycleTimeInSeconds(1.0);
gear.setBatchPerProductionCycle(1);
recipe.clear();
recipe.put(ironIngot, 1);
gear.setRecipe(recipe);
productRepository.save(gear);
When the code gets to the last line (productRepository.save(gear);) I get this error:
java.lang.StackOverflowError
at java.util.HashMap$HashIterator.<init>(HashMap.java:1427)
at java.util.HashMap$EntryIterator.<init>(HashMap.java:1477)
at java.util.HashMap$EntrySet.iterator(HashMap.java:1014)
at java.util.AbstractMap.hashCode(AbstractMap.java:528)
at org.hibernate.collection.internal.PersistentMap.hashCode(PersistentMap.java:574)
at com.hoernerice.dspcalculator.entity.ProductEntity.hashCode(ProductEntity.java:12)
at java.util.Objects.hashCode(Objects.java:98)
at java.util.HashMap$Node.hashCode(HashMap.java:297)
at java.util.AbstractMap.hashCode(AbstractMap.java:530)
at org.hibernate.collection.internal.PersistentMap.hashCode(PersistentMap.java:574)
at com.hoernerice.dspcalculator.entity.ProductEntity.hashCode(ProductEntity.java:12)
at java.util.Objects.hashCode(Objects.java:98)
at java.util.HashMap$Node.hashCode(HashMap.java:297)
at java.util.AbstractMap.hashCode(AbstractMap.java:530)
at org.hibernate.collection.internal.PersistentMap.hashCode(PersistentMap.java:574)
at com.hoernerice.dspcalculator.entity.ProductEntity.hashCode(ProductEntity.java:12)
at java.util.Objects.hashCode(Objects.java:98)
at java.util.HashMap$Node.hashCode(HashMap.java:297)
at java.util.AbstractMap.hashCode(AbstractMap.java:530)
at org.hibernate.collection.internal.PersistentMap.hashCode(PersistentMap.java:574)
at com.hoernerice.dspcalculator.entity.ProductEntity.hashCode(ProductEntity.java:12)
....
It repeats, so I guess I have an endless loop. But how? Where is my mistake? For the first 2 items the rows are in the database as expected, and the join table contains "just" ids, as intended. So that part works as expected.
Solved by replacing
recipe.clear()
with
recipe = new HashMap<>();
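In other words, the gear setup becomes roughly the following (continuing the init snippet above, so ironIngot, gear, recipe and productRepository are the same variables as before):
// Gear gets a brand-new map instead of clearing and reusing the one
// that ironIngot still references.
recipe = new HashMap<>();
recipe.put(ironIngot, 1);
gear.setRecipe(recipe);
productRepository.save(gear);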
Upvotes: 2
Views: 388
Reputation: 11860
HashMap.hashCode() is responsible for the overflow. HashMap extends AbstractMap, whose hashCode() method sums the hash code of each entry:
public int hashCode() {
    int h = 0;
    Iterator<Entry<K,V>> i = entrySet().iterator();
    while (i.hasNext())
        h += i.next().hashCode();
    return h;
}
Why the infinite recursive loop?
ironIngot.setRecipe(recipe);
and the following
recipe.put(ironIngot, 1);
effectively make the HashMap an entry of itself: ironIngot's recipe field points at the very map in which ironIngot is now a key, and Lombok's @Data generates a hashCode() for ProductEntity that includes that recipe field. So when the map computes its hashCode(), it needs the hash of each entry (next().hashCode()), the entry needs ironIngot's hash, and that in turn needs the hash of the same map again. And again, and again... in an infinite loop.
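To make that concrete, here is a minimal, self-contained sketch (plain Java; the hand-written hashCode() stands in for the one Lombok's @Data generates for ProductEntity) that reproduces the same cycle and the same error:
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

// Stand-in for ProductEntity: hashCode() includes the recipe map,
// just like the hashCode() generated by Lombok's @Data from all fields.
class Product {
    String name;
    Map<Product, Integer> recipe = new HashMap<>();

    @Override
    public int hashCode() {
        return Objects.hash(name, recipe); // hashing the map hashes every entry
    }
}

public class CycleDemo {
    public static void main(String[] args) {
        Map<Product, Integer> recipe = new HashMap<>();

        Product ironIngot = new Product();
        ironIngot.name = "Iron Ingot";
        ironIngot.recipe = recipe;   // ironIngot's recipe field is this map...
        recipe.put(ironIngot, 1);    // ...and ironIngot is also a key of this map

        // map hash -> key hash -> map hash -> ... until the stack overflows
        recipe.hashCode();
    }
}
Running main throws a StackOverflowError with essentially the same repeating frames as in the trace above.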
Simplification of the overflow loop
MapX
+--------------+
/| /|
/ | / |
*--+-----------* |
| | | |
| |- [MapX]------|-- ----
| | | | |
| +-----------+--+ |
| / | / |
|/ |/ |
*--------------* |
MapX V
+--------------+
/| /|
*-+------------* |
| | | |
| |- [MapX]----|-|---------
| | | | |
| +------------+-+ |
|/ |/ |
*--------------* |
MapX V
+--------------+
/| /|
*-+------------* |
| | | |
| |- [MapX]--------------
| | | | |
| +------------+-+ |
|/ |/ |
*--------------* V
(Road to Overflow Town)
As you commented, creating a new HashMap avoids this scenario: now there is no map that contains itself (directly or through one of its keys), so you broke the reference chain.
The code that shows this behaviour in its simplest form, and in essence does the same as yours, is this one:
Map<Object,String> map = new HashMap<>();
map.put(map,"letsLooop");
map.hashCode();
This snippet will also lead to a StackOverflowError, due to the infinite recursion in AbstractMap's hashCode() implementation, which keeps calling itself for the self-referencing entry and results in a beautiful StackOverflow.
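As a side note that goes beyond the original answer: ProductEntity.hashCode shows up in the trace because Lombok's @Data includes every field in the generated equals/hashCode. Excluding the recipe association from it would also prevent this kind of cycle; a sketch, assuming a Lombok version that has @EqualsAndHashCode.Exclude:
@Entity
@Data
public class ProductEntity {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    // ... other fields as in the question ...

    // Keep the association out of the Lombok-generated equals/hashCode (and
    // toString) so hashing the entity can never recurse back into a map that
    // contains the entity itself.
    @EqualsAndHashCode.Exclude
    @ToString.Exclude
    @ElementCollection
    @JoinTable(name = "end_product_recipe", joinColumns = @JoinColumn(name = "end_product_id"))
    @MapKeyColumn(name = "resourceId")
    @Column(name = "usage_count")
    private Map<ProductEntity, Integer> recipe;
}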
Upvotes: 4