Sean Lynch

Reputation: 2983

Filebeat -> Logstash indexing documents twice

I have Nginx logs being sent from Filebeat to Logstash, which indexes them into Elasticsearch.

Every entry gets indexed twice: once with the fields extracted by the grok filter, and then again with no fields found except for the "message" field.

This is the Logstash configuration.

02-beats-input.conf

input {
    beats {
        port => 5044
        ssl  => false
    }
}

11-nginx-filter.conf

filter {
    if [type] == "nginx-access" {
        grok {
            patterns_dir => ['/etc/logstash/patterns']
            match => { "message" => "%{NGINXACCESS}" }
        }
        date {
            match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z", "d/MMM/YYYY:HH:mm:ss Z" ]
        }
    }
}
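For the [type] conditional to match anything, Filebeat has to label the events. A minimal sketch of the relevant prospector section of filebeat.yml, assuming a Filebeat 1.x/5.x release where document_type is set per prospector (the log path is an example, adjust to your setup):

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/nginx/access.log   # example path, not from the question
  document_type: nginx-access     # must match the [type] tested in the filter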

Nginx Patterns

NGUSERNAME [a-zA-Z\.\@\-\+_%]+
NGUSER %{NGUSERNAME}
NGINXACCESS %{IPORHOST:clientip}\s+%{NGUSER:ident}\s+%{NGUSER:auth}\s+\[%{HTTPDATE:timestamp}\]\s+\"%{WORD:verb}\s+%{URIPATHPARAM:request}\s+HTTP/%{NUMBER:httpversion}\"\s+%{NUMBER:response}\s+(?:%{NUMBER:bytes}|-)\s+(?:\"(?:%{URI:referrer}|-)\"|%{QS:referrer})\s+%{QS:agent}
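For reference, a made-up combined-format access log line that this NGINXACCESS pattern is written to match (all values hypothetical):

192.168.1.10 - frank [10/Oct/2016:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326 "http://example.com/start" "Mozilla/5.0"

This would yield clientip, ident, auth, timestamp, verb, request, httpversion, response, bytes, referrer, and agent fields.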

30-elasticsearch-output.conf

output {
    elasticsearch {
        hosts => ["elastic00:9200", "elastic01:9200", "elastic02:9200"]
        manage_template => false
        index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
        document_type => "%{[@metadata][type]}"
    }
}

Upvotes: 0

Views: 159

Answers (1)

Sean Lynch

Reputation: 2983

Check your Filebeat configuration!

During setup I had accidentally uncommented and configured the output.elasticsearch section of filebeat.yml.

I then also configured the output.logstash section but forgot to comment the output.elasticsearch section back out.

This caused each entry to be sent once to Logstash, where it was grokked, and once directly to Elasticsearch, which produced the duplicate documents.
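A minimal sketch of the corrected output section of filebeat.yml, assuming the dotted output.* syntax from the answer; the hostnames are placeholders, and exactly one output should be active:

#output.elasticsearch:             # leave this commented out
#  hosts: ["elastic00:9200"]

output.logstash:
  hosts: ["logstash-host:5044"]    # hypothetical host; port matches the beats input above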

Upvotes: 0
