Reputation: 1961
I am loading data into Elasticsearch and I got an index.mapping.total_fields.limit error. I don't understand why I got this error; the class I am loading into ES looks like:
import org.springframework.data.annotation.Id
import org.springframework.data.elasticsearch.annotations.Document

@Document(type = "article", indexName = "data")
data class Product(
    @Id
    val id: String? = null,
    val category: String,
    val name: String,
    val imagesUrls: List<String>,
    val parameters: Map<String, List<String>>?
)
I had already added around 3k products when I got this error. Can you explain why? I thought my Product class had only 5 fields.
Upvotes: 0
Views: 566
Reputation: 19421
This behaviour comes from the fact that you don't specify how the parameters property should be stored. Let's assume that in one entity, parameters maps "foo" to some data. This leads to the following mapping being created (only showing the relevant part):
{
  "article": {
    "mappings": {
      "properties": {
        "parameters": {
          "properties": {
            "foo": {
              "type": "text",
              "fields": {
                "keyword": {
                  "type": "keyword",
                  "ignore_above": 256
                }
              }
            }
          }
        }
      }
    }
  }
}
If the next entity's parameters map "bar" to something, a new mapping entry for "bar" is created. And once the number of distinct keys across your documents exceeds the limit (1000 total fields by default), you get that error.
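To make that concrete, here is a minimal sketch (reusing the Product class from the question) in which every product introduces a fresh parameter key, so each save grows the mapping by one more field:

// A sketch only: with dynamic mapping, every distinct map key becomes
// its own field in the index mapping, so ~3k products with mostly
// unique keys easily blow past the default limit of 1000 total fields.
val products = (1..3000).map { i ->
    Product(
        category = "demo",
        name = "product-$i",
        imagesUrls = emptyList(),
        parameters = mapOf("param-$i" to listOf("value")) // unique key per product
    )
}
// repository.saveAll(products) // hypothetical repository; indexing fails once the limit is exceeded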
Edit 03.08.2020:
You could create a class Parameter:
data class Parameter(
    val key: String,
    val values: List<String>
)
and change the corresponding property in the Product class to
@Field(type = FieldType.Nested)
val parameters: List<Parameter>
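With the nested type, the mapping for parameters stays fixed no matter how many different keys your products use - roughly like this (a sketch, not copied from a live index; the inner fields are what Elasticsearch's dynamic mapping would typically produce):

"parameters": {
  "type": "nested",
  "properties": {
    "key": {
      "type": "text",
      "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
    },
    "values": {
      "type": "text",
      "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
    }
  }
}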
You lose the unique-key guarantee that the Map has and would need to check this somewhere in your code.
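If duplicate keys should be rejected rather than silently merged, one possible check is a small helper (the name is mine, not from any library):

// Hypothetical helper: fails fast if a parameter list contains
// duplicate keys, restoring the guarantee the Map used to give.
fun List<Parameter>.requireUniqueKeys() {
    val duplicates = groupingBy { it.key }.eachCount().filterValues { it > 1 }
    require(duplicates.isEmpty()) { "Duplicate parameter keys: ${duplicates.keys}" }
}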
Personally, I would have a domain layer in my application where I use a Map, and a persistence layer where I use the specific representation needed to store the data - here, a List. When converting data between the domain and persistence layers, I would apply this transformation.
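A minimal sketch of that conversion, assuming the Parameter class above (the extension-function names are mine):

// Domain -> persistence: every map entry becomes one Parameter.
fun Map<String, List<String>>.toParameters(): List<Parameter> =
    map { (key, values) -> Parameter(key, values) }

// Persistence -> domain: associate keeps the last entry if keys were
// duplicated, hence the uniqueness check mentioned above.
fun List<Parameter>.toParameterMap(): Map<String, List<String>> =
    associate { it.key to it.values }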
Upvotes: 2