Reputation: 92210
On Solr 1.4, do you know what could cause an OutOfMemoryError on this class?
org.apache.lucene.search.FieldCacheImpl$StringIndexCache#1
It takes 1 GB of RAM, and the underlying WeakHashMap has only 700 keys.
The cache configuration:
<filterCache
class="solr.FastLRUCache"
size="1024"
initialSize="0"
autowarmCount="0"/>
<queryResultCache
class="solr.FastLRUCache"
size="1024"
initialSize="0"
autowarmCount="0"/>
<documentCache
class="solr.FastLRUCache"
size="1024"
initialSize="0"
autowarmCount="0"/>
My objects are fairly large, but nowhere near 1 MB per object.
Xmx is set to 2 GB.
3 million documents are indexed.
The OOM appears at query time.
Upvotes: 2
Views: 1180
Reputation: 11023
If you check your /admin/stats.jsp for the Solr core, you can see this under FieldCache:
Provides introspection of the Lucene FieldCache, this is **NOT** a cache that is managed by Solr.
You can do nothing about this cache from within Solr, other than changing your queries. Most likely you are sorting on dynamic fields or faceting without using facet.method=enum. See Solr/Lucene fieldCache OutOfMemory error sorting on dynamic field.
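For example, switching to enum-based faceting can be done per request (a minimal sketch; the URL, handler name, and field name category are assumptions, not taken from the question):

http://localhost:8983/solr/select?q=*:*&facet=true&facet.field=category&facet.method=enum

or, assuming a standard search handler, as a default in solrconfig.xml:

<requestHandler name="/select" class="solr.SearchHandler">
  <lst name="defaults">
    <str name="facet.method">enum</str>
  </lst>
</requestHandler>

Note that facet.method=enum builds its counts from the filterCache (roughly one entry per facet term) instead of the FieldCache, so the filterCache size may need to grow; sorting on a field will still populate the FieldCache regardless of this setting.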
Upvotes: 2