I'm running Elastic v7.4.2 and have a process that produces JSON records with uppercase keys:
{
  "TS": 1572559271.768669,
  ...
  "RESP_GEO": {
    "CITY": "Ashburn",
    "COUNTRY": "United States",
    "SUBDIVISION": "Virginia",
    "LOCATION": {
      "LON": -77.4728,
      "LAT": 39.0481
    }
  }
}
I created a dynamic template to automagically handle the timestamp and locations:
{
  "dynamic_templates": [
    {
      "dates": {
        "match": "TS",
        "mapping": {
          "type": "date"
        }
      }
    },
    {
      "locations": {
        "match": "*LOCATION",
        "mapping": {
          "type": "geo_point"
        }
      }
    }
  ]
}
The ingest job throws an error:
field [RESP_GEO.LOCATION] of type [geo_point], caused_by "{"type":"parse_exception","reason":"field must be either [lat], [lon] ...
The LAT and LON attributes in the JSON are uppercase, but Elastic wants them to be lowercase. Is there a way to force Elastic to ignore the case? Or to convert all the keys to lowercase, inside Elastic, as a pre-indexing step?
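For the pre-indexing idea, something like this ingest pipeline is what I have in mind (untested sketch in Kibana Dev Tools syntax; the pipeline name lowercase-keys is made up, and arrays of objects aren't handled):

```json
PUT _ingest/pipeline/lowercase-keys
{
  "description": "Lowercase all field names before indexing",
  "processors": [
    {
      "script": {
        "source": """
          // Recursively rebuild a map with lowercased keys.
          def lower(Map m) {
            Map out = new HashMap();
            for (entry in m.entrySet()) {
              def v = entry.getValue();
              out.put(entry.getKey().toLowerCase(),
                      v instanceof Map ? lower((Map) v) : v);
            }
            return out;
          }
          // Rename top-level keys in place, skipping metadata like _index.
          List keys = new ArrayList(ctx.keySet());
          for (String k : keys) {
            if (k.startsWith("_")) {
              continue;
            }
            def v = ctx.remove(k);
            ctx.put(k.toLowerCase(), v instanceof Map ? lower((Map) v) : v);
          }
        """
      }
    }
  ]
}
```

Then the bulk requests would go through it with ?pipeline=lowercase-keys — but I don't know if a script processor like this is the intended way, or if there's something built in.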