h.l.m

Reputation: 13475

logstash and elasticsearch geo_point

I am using logstash to load geospatial data from a CSV into elasticsearch as geo_points.

The CSV looks like the following:

$ head -5 geo_data.csv
"lon","lat","lon2","lat2","d","i","approx_bearing"
-1.7841,50.7408,-1.7841,50.7408,0.982654,1,256.307
-1.7841,50.7408,-1.78411,50.7408,0.982654,1,256.307
-1.78411,50.7408,-1.78412,50.7408,0.982654,1,256.307
-1.78412,50.7408,-1.78413,50.7408,0.982654,1,256.307

I have created a mapping template that looks like the following:

$ cat map_template.json

{
  "template": "base_map_template",
  "order":    1,
  "settings": {
    "number_of_shards": 1
  },
    {
      "mappings": {
        "base_map": {
          "properties": {
            "lon2": { "type" : "float" },
            "lat2": { "type" : "float" },
            "d": { "type" : "float" },
            "appox_bearing": { "type" : "float" },
            "location": { "type" : "geo_point" }
          }
        }
      }
    }
}

My config file for logstash has been set up as follows:

$ cat map.conf

input {
  stdin {}
}

filter {
  csv {
      columns => [
        "lon","lat","lon2","lat2","d","i","approx_bearing"
      ]
  }

  if [lon] == "lon" {
      drop { }
  } else {
      mutate {
          remove_field => [ "message", "host", "@timestamp", "@version" ]
      }

      mutate {
          convert => { "lon" => "float" }
          convert => { "lat" => "float" }
          convert => { "lon2" => "float" }
          convert => { "lat2" => "float" }
          convert => { "d" => "float" }
          convert => { "i" => "integer"}
          convert => { "approx_bearing" => "float"}
      }

      mutate {
          rename => {
              "lon" => "[location][lon]"
              "lat" => "[location][lat]"
          }
      }
  }
}

output {
#  stdout { codec => rubydebug }
  stdout { codec => dots }
  elasticsearch {
      index => "base_map"
      template => "map_template.json"
      document_type => "node_points"
      document_id => "%{i}"
  }
}

I then try to use logstash to load the CSV data into elasticsearch as geo_points with the following command:

$ cat geo_data.csv | logstash-2.1.3/bin/logstash -f map.conf

I get the following error:

Settings: Default filter workers: 16
Unexpected character ('{' (code 123)): was expecting double-quote to start field name
at [Source: [B@278e55d1; line: 7, column: 3]{:class=>"LogStash::Json::ParserError", :level=>:error}
Logstash startup completed
....Logstash shutdown completed

What am I missing?

Upvotes: 0

Views: 1491

Answers (1)

darktachyon

Reputation: 268

There is a wayward "{" on line 7 of your template file: the stray opening brace before the `"mappings"` block (and its matching closing brace) makes the JSON invalid, which is exactly what the parser error is pointing at (`line: 7, column: 3`). Remove that brace pair so `"mappings"` sits at the top level of the template object.
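For reference, here is the template with the stray brace (and its matching close) removed. Note the mapping also misspells one field as `appox_bearing`, while the csv filter produces `approx_bearing`, so that key is corrected here too:

```json
{
  "template": "base_map_template",
  "order": 1,
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "base_map": {
      "properties": {
        "lon2": { "type": "float" },
        "lat2": { "type": "float" },
        "d": { "type": "float" },
        "approx_bearing": { "type": "float" },
        "location": { "type": "geo_point" }
      }
    }
  }
}
```

You can catch this kind of error before starting logstash by validating the file, e.g. `python -m json.tool map_template.json`, which reports the offending line and column for invalid JSON.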

Upvotes: 1
