nskalis

Reputation: 2382

logstash + elasticsearch | bug?

Would you be so kind as to help me solve the following issue: what does the output below mean? It seems that Logstash cannot connect to the local Elasticsearch node. But why?

logstash]# bin/logstash -f logstash_exabgp.cfg --debug --verbose
Using milestone 2 input plugin 'file'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.2-modified/plugin-milestones {:level=>:warn}
Registering file input {:path=>["/var/log/messages"], :level=>:info}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"/root/.sincedb_452905a167cf4509fd08acb964fdb20c", :path=>["/var/log/messages"], :level=>:info}
Grok patterns path {:patterns_dir=>["/opt/logstash/patterns/*"], :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/firewalls", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/grok-patterns", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/haproxy", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/java", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/junos", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/linux-syslog", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/mcollective", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/mcollective-patterns", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/mongodb", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/nagios", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/postgresql", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/redis", :level=>:info}
Grok loading patterns from file {:path=>"/opt/logstash/patterns/ruby", :level=>:info}
Match data {:match=>{"message"=>"%{SYSLOGTIMESTAMP:timestamp}%{GREEDYDATA}ExaBGP:%{SPACE}%{GREEDYDATA:msg}"}, :level=>:info}
Grok compile {:field=>"message", :patterns=>["%{SYSLOGTIMESTAMP:timestamp}%{GREEDYDATA}ExaBGP:%{SPACE}%{GREEDYDATA:msg}"], :level=>:info}
Pipeline started {:level=>:info}
New Elasticsearch output {:cluster=>nil, :host=>"127.0.0.1", :port=>"9200", :embedded=>false, :protocol=>"http", :level=>:info}
Automatic template management enabled {:manage_template=>"true", :level=>:info}
Using mapping template {:template=>"{  \"template\" : \"logstash-*\",  \"settings\" : {    \"index.refresh_interval\" : \"5s\"  },  \"mappings\" : {    \"_default_\" : {       \"_all\" : {\"enabled\" : true},       \"dynamic_templates\" : [ {         \"string_fields\" : {           \"match\" : \"*\",           \"match_mapping_type\" : \"string\",           \"mapping\" : {             \"type\" : \"string\", \"index\" : \"analyzed\", \"omit_norms\" : true,               \"fields\" : {                 \"raw\" : {\"type\": \"string\", \"index\" : \"not_analyzed\", \"ignore_above\" : 256}               }           }         }       } ],       \"properties\" : {         \"@version\": { \"type\": \"string\", \"index\": \"not_analyzed\" },         \"geoip\"  : {           \"type\" : \"object\",             \"dynamic\": true,             \"path\": \"full\",             \"properties\" : {               \"location\" : { \"type\" : \"geo_point\" }             }         }       }    }  }}", :level=>:info}
NoMethodError: undefined method `tv_sec' for nil:NilClass
        sprintf at /opt/logstash/lib/logstash/event.rb:230
           gsub at org/jruby/RubyString.java:3041
        sprintf at /opt/logstash/lib/logstash/event.rb:216
        receive at /opt/logstash/lib/logstash/outputs/elasticsearch.rb:308
         handle at /opt/logstash/lib/logstash/outputs/base.rb:86
     initialize at (eval):72
           call at org/jruby/RubyProc.java:271
         output at /opt/logstash/lib/logstash/pipeline.rb:266
   outputworker at /opt/logstash/lib/logstash/pipeline.rb:225
  start_outputs at /opt/logstash/lib/logstash/pipeline.rb:152

while the configuration file is as follows:

logstash]# cat logstash_exabgp.cfg 
input   {
    file    {
        path    =>  ["/var/log/messages"]
    }
}
filter  {
    if [message] !~ /ExaBGP/ { 
            drop { } 
    }
    grok    {
        match   =>  [ "message", "%{SYSLOGTIMESTAMP:timestamp}%{GREEDYDATA}ExaBGP:%{SPACE}%{GREEDYDATA:msg}"]
        remove_field    =>  [ "message", "host", "path", "@timestamp", "@version" ]
    }
    date    {
        match   =>  ["logdate", "MMM dd HH:mm:ss"]
    }
}
output  {
#   file    {
#       path    =>  "NIKOS.txt"
#   }
#   stdout { codec => rubydebug }
    elasticsearch { 
        host    =>  "127.0.0.1"
        protocol    =>  http    
    }
}

Upvotes: 1

Views: 558

Answers (3)

katy

Reputation: 181

I had that problem too, so I removed file from the input and used:

input {
    stdin {
    }
}
. . .

And you must execute Logstash this way:

bin/logstash --config /home/logstash/conf/ex.conf < /home/var/log/messages

Because the file input isn't working anymore.

Upvotes: 1

nskalis

Reputation: 2382

Any @-prefixed field is used internally by Logstash; removing them tends to cause errors.
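
A minimal sketch of the filter block from the question with the @-prefixed fields left out of remove_field (field names copied from the original config; an illustration, not a verified fix):

filter {
    grok {
        match           =>  [ "message", "%{SYSLOGTIMESTAMP:timestamp}%{GREEDYDATA}ExaBGP:%{SPACE}%{GREEDYDATA:msg}" ]
        # @timestamp is used to build the elasticsearch index name
        # (logstash-%{+YYYY.MM.dd} by default), which appears to be why
        # removing it triggers the `tv_sec' NoMethodError in event.rb's sprintf
        remove_field    =>  [ "message", "host", "path" ]
    }
}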

Upvotes: 1

MUFC

Reputation: 2013

I assume this is the first time you are running Logstash. The problem here is that Logstash is not able to find information about the file you are referring to.

Use the following code and try to give the absolute path of the files you want to parse.

file {
    path           => ["/var/log/messages"]
    start_position => "beginning"
}
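
If /var/log/messages has already been read once, the recorded sincedb position can still keep it from being re-read from the beginning; a sketch for testing only (the sincedb_path value here is just an example):

file {
    path           => ["/var/log/messages"]
    start_position => "beginning"
    # a throwaway sincedb location so the file is re-read on every run
    sincedb_path   => "/dev/null"
}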

Upvotes: 0
