Reputation: 601
In our Elasticsearch index we have daily news documents, and we run aggregations over them. After two consecutive runs, Elasticsearch returns a not-enough-memory exception. We have already increased the heap size, but is there any solution other than adding more RAM for Elasticsearch?
The mapping of the field used for aggregations:
"detail_stop": {
"type": "string",
"store": true,
"analyzer": "stop_analyzer"
}
The aggregation query:
{
  "from": 0,
  "size": 5000,
  "query": {
    "bool": {
      "must": [
        {
          "range": {
            "date": {
              "gte": "now-0d/d"
            }
          }
        }
      ]
    }
  },
  "aggs": {
    "words": {
      "terms": {
        "size": 5000,
        "field": "detail_stop",
        "min_doc_count": 3
      }
    }
  }
}
Currently we have an Elasticsearch cluster with one node (8 cores @ 2.5 GHz, 32 GB RAM) and ES_HEAP_SIZE = 16g (Elasticsearch has 16 GB of heap). How can we reduce memory usage and improve performance?
Upvotes: 1
Views: 4870
Reputation: 6357
If you are using the field data cache, you have to provide enough memory for aggregations to work. You can try the tips mentioned here to monitor and limit field data memory consumption.
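As a rough sketch of the kind of limits those tips describe (the exact values here are illustrative assumptions, not recommendations), you can cap the field data cache and its circuit breaker in elasticsearch.yml so runaway aggregations are evicted or rejected instead of exhausting the heap:

```yaml
# elasticsearch.yml -- illustrative values, tune for your own node

# Cap the field data cache; once full, older entries are evicted
# instead of the cache growing without bound.
indices.fielddata.cache.size: 30%

# Trip the circuit breaker on requests that would load more field
# data than this fraction of the heap, failing the request early.
indices.breaker.fielddata.limit: 40%
```

Note that a cache size cap trades memory for repeated reloading of field data, so aggregation latency may increase once the cap is hit.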
With newer versions of Elasticsearch, doc values are becoming the norm instead of the field data cache. They are slightly slower than field data, but they solve the memory problem because the values live on disk (memory-mapped) rather than on the JVM heap. Mind you, to use doc values you will have to re-index all your data. Read more about it here.
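A hypothetical mapping sketch of what that re-index could look like: in the 1.x/2.x mapping API, string fields only support doc values when they are not_analyzed, so an analyzed field like detail_stop would need a not_analyzed sub-field (the "raw" name below is my own choice) to aggregate on with doc values:

```json
"detail_stop": {
  "type": "string",
  "store": true,
  "analyzer": "stop_analyzer",
  "fields": {
    "raw": {
      "type": "string",
      "index": "not_analyzed",
      "doc_values": true
    }
  }
}
```

Be aware this changes the aggregation semantics: a terms aggregation on detail_stop.raw buckets whole field values, not the individual analyzed words your current query produces, so it may not be a drop-in replacement for your use case.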
Upvotes: 0