Reputation: 4305
I have some json (below) that I need to index in elasticsearch:
{
  "array": [
    "item1",
    {
      "name": "item2"
    }
  ]
}
When I try to index this type of JSON structure, I get an error:
{
"error": "MapperParsingException[failed to parse [array]]; nested: ElasticsearchIllegalArgumentException[unknown property [name]]; ",
"status": 400
}
Now, I understand that Elasticsearch is getting confused because array contains a string for item1 and then an object for item2.
My question is, how would I define a mapping to handle this type of data?
Upvotes: 1
Views: 63
Reputation: 22332
The only way this can work is to define your index mapping so that the array field is not enabled, thereby making it not searchable (Elasticsearch avoids parsing that field).
curl -XPUT localhost:9200/your-index -d '{
  "mappings": {
    "your-type": {
      "properties": {
        "array": {
          "type": "object",
          "enabled": false
        }
      }
    }
  }
}'
You must do this because, if you don't tell Elasticsearch the mapping, it will dynamically map the type based on what it sees. In your example the first element is a string, so it will map array as a string. When it then hits the object, it does not know what to do, so it has to stop.
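To see that dynamic mapping behavior in isolation, here is a minimal sketch against a fresh, throwaway index (the name scratch-index and the document id 1 are just placeholders): index a document whose array contains only strings, inspect the mapping Elasticsearch generated, and then the mixed array from the question will fail against it.

# Index a document where "array" contains only strings; the index is
# auto-created and "array" is dynamically mapped as a string field
curl -XPUT localhost:9200/scratch-index/your-type/1 -d '{
  "array": ["item1"]
}'

# Inspect the generated mapping; "array" comes back as a string field,
# roughly: {"scratch-index":{"mappings":{"your-type":{"properties":{"array":{"type":"string"}}}}}}
curl -XGET localhost:9200/scratch-index/_mapping

# Indexing the mixed array from the question into this index now fails
# with the MapperParsingException shown above, because an object does
# not fit a string mapping.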
This is not as useless as it sounds because you can still retrieve the value(s) of the array field from the document's _source (assuming it's stored, which it is by default). However, it does mean that the information in array is not searchable within Elasticsearch. You can get the _source by doing a search or a get request.
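As a rough sketch of what that looks like (again using the placeholder your-index/your-type names and document id 1): with the "enabled": false mapping in place, the mixed array indexes without error, and a get request returns the full array from _source.

# With the "enabled": false mapping in place, the mixed array indexes fine
curl -XPUT localhost:9200/your-index/your-type/1 -d '{
  "array": [
    "item1",
    { "name": "item2" }
  ]
}'

# A get request returns the original document, including the full array,
# from _source -- it just isn't searchable
curl -XGET localhost:9200/your-index/your-type/1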
As Andrei commented, you may benefit from having a cleaner object, but sometimes this is just the way that it is.
Upvotes: 1