Reputation: 1950
I have the following example input data `doc`, which I'm indexing with the Python elasticsearch client:
doc = {
    "node": [{
        "table": [{
            "node-table": {
                "packets-up": 18440044073709951615,
                "packets-down": 18447644073709991615
            }
        }]
    }]
}
from elasticsearch import Elasticsearch
es = Elasticsearch(hosts="localhost:9200")
res = es.indices.create(index="doc")
es.index(index="doc", doc_type='docs', body=doc)
While trying to index the data with dynamic mapping, I get the following error:
Traceback (most recent call last):
File "test.py", line 60, in <module>
es.index(index="doc_test", doc_type='docs', body=doc)
File "/Users/user/Projects/2018/es_test/.env/lib/python2.7/site-packages/elasticsearch/client/utils.py", line 76, in _wrapped
return func(*args, params=params, **kwargs)
File "/Users/user/Projects/2018/es_test/.env/lib/python2.7/site-packages/elasticsearch/client/__init__.py", line 319, in index
_make_path(index, doc_type, id), params=params, body=body)
File "/Users/user/Projects/2018/es_test/.env/lib/python2.7/site-packages/elasticsearch/transport.py", line 318, in perform_request
status, headers_response, data = connection.perform_request(method, url, params, body, headers=headers, ignore=ignore, timeout=timeout)
File "/Users/user/Projects/2018/es_test/.env/lib/python2.7/site-packages/elasticsearch/connection/http_urllib3.py", line 186, in perform_request
self._raise_error(response.status, raw_data)
File "/Users/user/Projects/2018/es_test/.env/lib/python2.7/site-packages/elasticsearch/connection/base.py", line 125, in _raise_error
raise HTTP_EXCEPTIONS.get(status_code, TransportError)(status_code, error_message, additional_info)
elasticsearch.exceptions.RequestError: RequestError(400, u'mapper_parsing_exception', u'failed to parse')
I assume this is because these numeric values don't fit in the "long" data type.
How can I handle these numeric values?
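For reference, Elasticsearch's long is a signed 64-bit integer, and a quick check confirms the suspicion that both values overflow it:

```python
# Elasticsearch's "long" type is a signed 64-bit integer
LONG_MAX = 2**63 - 1  # 9223372036854775807

for value in (18440044073709951615, 18447644073709991615):
    print(value > LONG_MAX)  # True for both: each value exceeds the long range
```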
ElasticSearch Trace:
curl -H 'Content-Type: application/json' -XPOST 'http://localhost:9200/doc/docs?pretty' -d '
{
  "node": [
    {
      "table": [
        {
          "node-table": {
            "packets-down": 18447644073709991615,
            "packets-up": 18440044073709951615
          }
        }
      ]
    }
  ]
}
'
Response:
{
  "error" : {
    "root_cause" : [
      {
        "type" : "mapper_parsing_exception",
        "reason" : "failed to parse"
      }
    ],
    "type" : "mapper_parsing_exception",
    "reason" : "failed to parse",
    "caused_by" : {
      "type" : "illegal_state_exception",
      "reason" : "No matching token for number_type [BIG_INTEGER]"
    }
  },
  "status" : 400
}
Upvotes: 0
Views: 237
Reputation: 217464
You have two options to circumvent this problem.
Option A. Store these values as float (or double) instead of long.
First you need to make sure that your packets-down and packets-up fields are mapped as float (or double), like this:
PUT doc_test
{
  "mappings": {
    "docs": {
      "dynamic_templates": [
        {
          "bignums": {
            "match": "packets*",
            "mapping": {
              "type": "float"
            }
          }
        }
      ]
    }
  }
}
And then you need to wrap the numbers in double quotes and send them as strings:
doc = {
    "node": [{
        "table": [{
            "node-table": {
                "packets-up": "18440044073709951615",
                "packets-down": "18447644073709991615"
            }
        }]
    }]
}
That will work, and you'll be able to sum your packets fields like any other field containing numeric values.
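If the documents come from elsewhere and you can't hand-quote the values, a small helper can do the quoting before indexing. This is a hypothetical sketch, not part of the elasticsearch client:

```python
def stringify_big_ints(obj, limit=2**63 - 1):
    """Recursively replace ints outside the signed 64-bit range with strings."""
    if isinstance(obj, dict):
        return {k: stringify_big_ints(v, limit) for k, v in obj.items()}
    if isinstance(obj, list):
        return [stringify_big_ints(v, limit) for v in obj]
    if isinstance(obj, int) and abs(obj) > limit:
        return str(obj)
    return obj

doc = {
    "node": [{
        "table": [{
            "node-table": {
                "packets-up": 18440044073709951615,
                "packets-down": 18447644073709991615
            }
        }]
    }]
}

safe_doc = stringify_big_ints(doc)
# safe_doc now carries both packet counts as strings; smaller ints are untouched
```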
Option B. Enable numeric detection (disabled by default)
PUT doc_test
{
  "mappings": {
    "docs": {
      "numeric_detection": true
    }
  }
}
And then, here too, you need to wrap the numbers in double quotes and send them as strings:
doc = {
    "node": [{
        "table": [{
            "node-table": {
                "packets-up": "18440044073709951615",
                "packets-down": "18447644073709991615"
            }
        }]
    }]
}
As a result, the big numbers will be mapped as float.
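One caveat worth noting with either option: float (and even double) cannot represent counts of this magnitude exactly, so what gets stored is the nearest representable approximation:

```python
value = 18440044073709951615
as_float = float(value)

print(as_float)           # roughly 1.844e+19: the magnitude survives
print(as_float == value)  # False: the exact packet count does not
```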
Upvotes: 1