Reputation: 3616
I'm using Hadoop in my application, but just before the program exits I get this error: java.lang.OutOfMemoryError: Java heap space
I already modified mapred-site.xml and added this property to it:
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx4096m</value>
</property>
but the exception still appears.
I ran this command in the terminal: java -XX:+PrintFlagsFinal -version | grep -iE 'HeapSize|PermSize|ThreadStackSize'
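For reference, Hadoop 2.x deprecated mapred.child.java.opts in favor of separate per-task properties; a minimal sketch of the equivalent setting under the newer names (values are illustrative):

```xml
<!-- mapred-site.xml (Hadoop 2.x+): per-task JVM heap settings,
     superseding the older mapred.child.java.opts key. -->
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx4096m</value>
</property>
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xmx4096m</value>
</property>
```

Note these only size the child task JVMs; they do not affect the heap of the driver JVM that launches the job.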
and this was the result:
uintx AdaptivePermSizeWeight = 20 {product}
intx CompilerThreadStackSize = 0 {pd product}
uintx ErgoHeapSizeLimit = 0 {product}
uintx HeapSizePerGCThread = 87241520 {product}
uintx InitialHeapSize := 1054841728 {product}
uintx LargePageHeapSizeThreshold = 134217728 {product}
uintx MaxHeapSize := 16877879296 {product}
uintx MaxPermSize = 174063616 {pd product}
uintx PermSize = 21757952 {pd product}
intx ThreadStackSize = 1024 {pd product}
intx VMThreadStackSize = 1024 {pd product}
java version "1.6.0_31"
OpenJDK Runtime Environment (IcedTea6 1.13.3) (6b31-1.13.3-1ubuntu1~0.12.04.2)
OpenJDK 64-Bit Server VM (build 23.25-b01, mixed mode)
Could anyone please advise how to fix this issue?
Upvotes: 0
Views: 912
Reputation: 50
Your problem is likely a memory leak.
You should review your code to find what is causing the leak. Usually it is caused by objects that remain reachable (for example, held in static collections), so the GC is never able to reclaim them.
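To illustrate the kind of leak described above, here is a minimal, hypothetical sketch (the class and method names are invented for this example): records appended to a static list stay reachable for the life of the JVM, so the GC can never free them, and the heap fills up over many calls.

```java
import java.util.ArrayList;
import java.util.List;

public class LeakSketch {
    // Leak pattern: this list grows without bound across calls,
    // and every element stays strongly reachable via the static field,
    // so the GC cannot reclaim any of it -> eventual OutOfMemoryError.
    static final List<String> cache = new ArrayList<>();

    static void leakyMap(String record) {
        cache.add(record);            // added but never removed
    }

    // Fix: scope the buffer to the call so it becomes unreachable
    // (and collectible) as soon as the method returns.
    static int boundedMap(String record) {
        List<String> local = new ArrayList<>();
        local.add(record);
        return local.size();          // 'local' is garbage after this
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1000; i++) {
            leakyMap("rec" + i);
        }
        System.out.println(cache.size());       // 1000 entries retained
        System.out.println(boundedMap("rec"));  // 1
    }
}
```

Tools like jmap or a heap-dump analyzer can show which objects dominate the heap and which references keep them alive.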
Upvotes: 1