Danubio

Reputation: 103

Elasticsearch circuit_breaking_exception problem with Python CGI

I am using an Elasticsearch instance with a Python 3 CGI Script in an Apache server. The web application generally works fine, but sometimes, with certain queries I get this Python exception:

TransportError: TransportError(429, 'circuit_breaking_exception', '[parent] Data too large, data for [<http_request>] would be [987816208/942mb], which is larger than the limit of [986932838/941.2mb], real usage: [987816208/942mb], new bytes reserved: [0/0b]')

It does not always happen with the same query. Sometimes it works fine and I get the result, sometimes it fails.

The Python code:

#!/usr/bin/python3

import cgitb
import cgi
import re
from elasticsearch import Elasticsearch
from datetime import datetime
import json
import io
import sys

def enc_print(string='', encoding='utf8'):
    sys.stdout.buffer.write(string.encode(encoding))

es = Elasticsearch([{'host': 'localhost', 'port': 9200}])

cgitb.enable(display=0, logdir='/home/myname/Desktop/')
form = cgi.FieldStorage() 

if form.getvalue('dictionary') == 'One':
    dictionary = 'one'
elif form.getvalue('dictionary') == 'Two':
    dictionary = 'two'
else:
    dictionary = 'three'

res = es.get(index=dictionary, doc_type='word', id=form.getvalue('id'))

I've tried increasing indices.breaker.fielddata.limit to 75%, and the error occurs less frequently, but it still persists. I have only one replica, and the document count is 8061.

Should I increase the fielddata limit even further, or is it better to try something else?
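Since the failure is intermittent (the same query sometimes succeeds), one stopgap on the client side is to retry on the 429 response with a short backoff. This is a minimal sketch, not from the original post: it assumes the elasticsearch-py client raises `TransportError` with a `status_code` attribute of 429 for circuit-breaker rejections, and uses a stand-in exception class so the snippet is self-contained.

```python
import time

class TransportError(Exception):
    """Stand-in for elasticsearch.TransportError, which carries status_code."""
    def __init__(self, status_code, message):
        super().__init__(status_code, message)
        self.status_code = status_code

def get_with_retry(fetch, retries=3, backoff=0.5):
    """Call fetch(); on a 429 TransportError, wait and retry with
    exponential backoff. Re-raise on other errors or when retries
    are exhausted."""
    for attempt in range(retries):
        try:
            return fetch()
        except TransportError as e:
            if e.status_code != 429 or attempt == retries - 1:
                raise
            time.sleep(backoff * (2 ** attempt))

# Hypothetical usage with the question's es.get call:
# res = get_with_retry(
#     lambda: es.get(index=dictionary, doc_type='word',
#                    id=form.getvalue('id')))
```

Retrying only papers over the symptom; the breaker trips because the node's heap is under pressure, so the server-side fix still matters.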

Thank you in advance.

Upvotes: 0

Views: 636

Answers (1)

hamid bayat

Reputation: 2179

A 1-gigabyte heap is very low for Elasticsearch. Note that your error is the [parent] breaker, not the fielddata breaker, which means overall heap usage is hitting the limit. Try increasing physical RAM and the JVM heap size.
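If you do raise the heap, the usual place is the jvm.options file (the path varies by install; the 4 GB value below is only illustrative, and min/max should be set equal):

```
# e.g. /etc/elasticsearch/jvm.options
-Xms4g
-Xmx4g
```

Keep the heap at or below half of physical RAM so the OS page cache still has room.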

Upvotes: 1
