alonisser

Reputation: 12088

EsHadoopIllegalArgumentException: invalid map received dynamic=strict errors on elasticsearch-hadoop

I'm trying with both the DataFrame API and the RDD API:

import org.elasticsearch.spark._

// Connection and read settings for elasticsearch-hadoop
val map = collection.mutable.Map[String, String]()
map("es.nodes.wan.only") = "true"
map("es.port") = "reducted"
map("es.net.http.auth.user") = "reducted"
map("es.net.http.auth.pass") = "reducted"
map("es.net.ssl") = "true"
map("es.mapping.date.rich") = "false"
map("es.read.field.include") = "data_scope_id"
map("es.nodes") = "reducted"

// Read the index as an RDD and pull a single document
val rdd = sc.esRDD("index name", map)
rdd.take(1)
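
The DataFrame API attempt looks roughly like this (a minimal sketch, assuming the elasticsearch-spark-sql connector is on the classpath and spark is the active SparkSession; it reuses the same map of options):

// Read the same index through the DataFrame API with the same options
val df = spark.read
  .format("org.elasticsearch.spark.sql")
  .options(map)
  .load("index name")

// Pull a single row of the one included field
df.select("data_scope_id").show(1)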

But whatever I try, I get this error:

EsHadoopIllegalArgumentException: invalid map received dynamic=strict

I've tried limiting the fields being read with es.read.field.include, but even if I choose a single field which I'm sure doesn't have any variants, I still get this error.

How can I work around this? I'd be glad for any advice.

Versions:

Clarification

This is about reading from Elasticsearch in Spark, not indexing.

Upvotes: 0

Views: 605

Answers (1)

Paulo

Reputation: 10746

So if I understand correctly, your aim is to index the values in map into index name.

TLDR;

Update the mapping of your index to allow new fields to be indexed. As of now, the value of dynamic is strict, which does not allow new fields and throws an exception.

PUT /index name/_mapping
{
  "dynamic": true
}

To understand

The issue is with the mapping of your index. There is a setting called [dynamic](https://www.elastic.co/guide/en/elasticsearch/reference/current/dynamic.html) on the mapping of your index.

I bet it is set to strict, which according to the doc:

If new fields are detected, an exception is thrown and the document is rejected. New fields must be explicitly added to the mapping.

So, my understanding is that you have one or more fields in your document that are new to the mapping.

Either:

  • Fix the document
  • Fix the mapping
  • Switch dynamic to true, false or runtime according to your needs

Upvotes: 0
