Reputation: 2250
Premise: I'm a newbie at Java questions, but I'm trying to understand.
I'm experiencing "OutOfMemoryError: Java heap space" in a Solr instance under high load (many concurrent read-only requests).
I've found that the usual advice in these situations is to increase the maximum heap size with the -Xmx
parameter, but looking at the figures in JConsole I see that only 50% of the 1 GB total heap is used. Only the Eden space and Survivor space have occasionally reached 100%, but I understand this is normal.
Are there cases in which we can get an OutOfMemoryError in heap space even when it isn't full? Which parameters can I set to avoid this?
Upvotes: 0
Views: 1400
Reputation: 86
Check whether you are running a 32-bit JRE on a 64-bit OS; it can cause your problem. I just brought my Solr server back up after reviewing the logs.
Look in the Java dump log for something like this:
tenured generation total 174784K, used 174765K [0x09750000, 0x14200000, 0x14200000)
  the space 174784K, 99% used [0x09750000, 0x141fb588, 0x141fb600, 0x14200000)
It seems the garbage collector does not handle some Java generations well when the JRE and OS bitness don't match.
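One way to confirm which JRE bitness you're actually running is to check the JVM's system properties. A minimal sketch, assuming a Sun/Oracle JVM (the `sun.arch.data.model` property is vendor-specific and may be absent elsewhere):

```java
public class BitnessCheck {
    public static void main(String[] args) {
        // "sun.arch.data.model" reports "32" or "64" on Sun/Oracle JVMs;
        // on other vendors' JVMs it may be missing, hence the fallback.
        String dataModel = System.getProperty("sun.arch.data.model", "unknown");
        // "os.arch" reflects the architecture the JVM was built for,
        // not necessarily the underlying OS architecture.
        String osArch = System.getProperty("os.arch");
        System.out.println("JVM data model: " + dataModel + "-bit");
        System.out.println("os.arch: " + osArch);
    }
}
```

Running this inside the same JVM that hosts Solr (or with the same `java` binary) shows whether you are on a 32-bit JRE even though the OS is 64-bit.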
Upvotes: 1
Reputation: 64
Sure, it's possible. A fairly common case is very large objects: the JVM will sometimes allocate them directly in the tenured generation, and if there is no contiguous block of free memory large enough to hold one, you'll get the heap space error.
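As a rough illustration (the exact sizes and allocation behavior are JVM- and collector-dependent, so treat this as a sketch rather than guaranteed behavior): a single large array must occupy one contiguous chunk of heap, so a big allocation can fail even when the total free heap exceeds the request.

```java
public class LargeAllocation {
    public static void main(String[] args) {
        // A single array needs one contiguous block of heap.
        // Run with e.g. -Xmx64m and this ~48 MB request may fail with
        // "OutOfMemoryError: Java heap space", even though many smaller
        // allocations adding up to the same total would still succeed.
        try {
            byte[] big = new byte[48 * 1024 * 1024];
            System.out.println("allocated " + big.length + " bytes");
        } catch (OutOfMemoryError e) {
            System.out.println("large allocation failed: " + e);
        }
    }
}
```

This is consistent with what JConsole shows in the question: overall heap usage can look moderate while individual large requests still fail.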
I hit a similar problem with Solr about a year ago; it was related to heavy use of dynamic fields in the import, which in turn tried to create many (and large) objects. Do you use a lot of dynamic fields?
I think the commenters have it right: you should take a heap dump and try to figure out what is going on that way. Once you get into Solr tuning you'll likely be tweaking the heap space sizes anyway.
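For example, with a Sun/Oracle JDK you can capture a dump with `jmap`, or have the JVM write one automatically on the next OutOfMemoryError. The PID and file paths below are placeholders:

```
# Dump the heap of the running Solr JVM (replace 12345 with its PID)
jmap -dump:format=b,file=solr-heap.hprof 12345

# Or add these flags to the Solr start command so a dump is written
# automatically when an OutOfMemoryError occurs:
#   -XX:+HeapDumpOnOutOfMemoryError
#   -XX:HeapDumpPath=/tmp/solr-heap.hprof
```

The resulting .hprof file can then be opened in a heap analyzer such as Eclipse MAT or jvisualvm to see which objects dominate the heap.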
I found this page pretty useful: http://www.kdgregory.com/index.php?page=java.outOfMemory
Upvotes: 0