Reputation: 6141
I am experimenting with the ELK stack, and so far so good. I have a small issue that I am trying to resolve. I have a field named 'message' coming from Filebeat. Inside that field is a string with logging data. Sometimes that message field might contain this line:
successfully saved with IP address: [142.93.111.8] user: [[email protected]]
I would like to apply a filter so that Logstash sends this to Elasticsearch:
successfully saved with IP address: [] user: [[email protected]]
This is what I currently have in the Logstash configuration:
input {
  beats {
    port => "5043"
    codec => json
  }
}

filter {
  if [message] =~ /IP address:/ {
    mutate { add_tag => "whats happening" }
  }
}

output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}
Something else caught my attention: ELK can do text filtering at the Filebeat level and also at the Logstash level. Which one is the more usual scenario? Is Filebeat filtering more suitable?
Upvotes: 0
Views: 603
Reputation: 6141
I have found the correct solution in my case:
mutate {
  gsub => ["message", "address: \[(.*?)]", "address:[not indexable]"]
}
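For reference, a minimal sketch of how this gsub could be combined with the conditional and tag from the original filter block, assuming you still want both applied to the same message field:

filter {
  if [message] =~ /IP address:/ {
    mutate {
      # tag the event as before, and mask the bracketed address in the same mutate
      add_tag => "whats happening"
      gsub    => ["message", "address: \[(.*?)]", "address:[not indexable]"]
    }
  }
}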
Hopefully someone will find it useful.
Upvotes: 0