Reputation: 1
I'm building a solution that processes data from Lambda (Python 2.7) through a Kinesis stream and Firehose to an Elasticsearch domain. The data is stored in a Python dictionary and dumped as JSON to Kinesis:
dataDictionary = {
    "precipitationType": precipitationType,
    "location": location,
    "humidity": humidity,
    "groundTemp": groundTemp,
    "airTemp": airTemp,
    "windSpeed": windSpeed,
    "windDirection": windDirection,
    "measureDate": parsedMeasureDate,
    "systemDate": systemDate,
    "stationGeoLatitude": stationGeoLatitude,
    "stationGeoLongitude": stationGeoLongitude
}
# Push data to the AWS Kinesis stream
res = kinesis.put_record(StreamName=LocalStreamName, Data=json.dumps(dataDictionary), PartitionKey='systemDate')
The process is successful, but when I want to display the results on a map in Kibana, I only have two float fields and no geo_point/geohash fields.
I cannot figure out how to map them in the AWS Elasticsearch Service. I found some documentation about mappings, but I have no idea how to apply it inside AWS. Maybe I should pass this data in a different way in the Python code?
Upvotes: 0
Views: 696
Reputation: 1770
You have to use mappings and tell Elasticsearch to map your two fields as a geo_point location:
https://www.elastic.co/guide/en/elasticsearch/reference/current/geo-point.html
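As those docs describe, a geo_point field accepts (among other formats) an object with lat and lon keys. So on the producer side, a minimal sketch of the change to your dictionary could look like this. The field name location matches the mapping below; since your dictionary already uses location for something else, one of the two would need renaming:

# Sketch: combine the two floats into a single object that geo_point accepts.
# "location" must match the field name used in the mapping; rename as needed.
dataDictionary = {
    # ... other fields as before ...
    "location": {
        "lat": stationGeoLatitude,
        "lon": stationGeoLongitude
    }
}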
You will have to reindex your data, but first specify your mappings.
You could do it using the Python client, or post the JSON mapping manually:
PUT your_index
{
  "mappings": {
    "your_type": {
      "properties": {
        "location": {
          "type": "geo_point"
        }
      }
    }
  }
}
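If you prefer doing it from Python, here is a minimal sketch that PUTs the same mapping with the requests library. The endpoint and the index/type names are placeholders, and it assumes your domain's access policy allows unsigned requests from your IP; otherwise the request has to be SigV4-signed (for example via the aws-requests-auth package):

import json
import requests

# Placeholder endpoint of your AWS Elasticsearch domain
ES_ENDPOINT = "https://your-domain.eu-west-1.es.amazonaws.com"

mapping = {
    "mappings": {
        "your_type": {
            "properties": {
                "location": {"type": "geo_point"}
            }
        }
    }
}

# Create the index with the geo_point mapping (mappings of existing
# fields cannot be changed in place, hence the need to reindex).
resp = requests.put(
    ES_ENDPOINT + "/your_index",
    data=json.dumps(mapping),
    headers={"Content-Type": "application/json"}
)
print(resp.status_code, resp.text)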
Upvotes: 1