I have a Logstash client and a Logstash server.
The client ships log files to the server with Logstash's udp output, and the server also runs Logstash to receive these logs. On the server, I have a json filter that parses the JSON-formatted message into fields of the actual event, so that Elasticsearch can index them.
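For context, the json filter takes a JSON string held in one field and expands its keys into top-level fields on the event. A hypothetical before/after (the field values here are made up for illustration):

```
# Event arriving over udp, before filtering:
#   "message" => "{\"clientip\":\"10.0.0.1\",\"response\":200}"
#
# After json { source => "message" }, the parsed keys become event fields:
#   "clientip" => "10.0.0.1"
#   "response" => 200
filter {
  json {
    source => "message"
  }
}
```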
Here is my code from the server:
input {
  udp {}
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
  }
}
And from the client:
input {
  file {
    type => "apache-access"
    path => "/var/log/apache2/access.log"
  }
}
output {
  udp {
    host => "192.168.0.3"
  }
}
This code works fine except for one thing:
somehow I get the type field twice, once as type and once as _type, and both have the same content.
I've tried to delete the type field with the mutate filter, like this:
mutate {
  remove_field => [ "type" ]
}
but this filter removes both type fields (the _type field then falls back to its default, logs).
How can I keep the _type field and remove only the type field?
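For background: the elasticsearch output derives _type from the event's type field, so once type has been removed in the filter stage there is nothing left for _type to use and it falls back to the default. One workaround is to move the value into [@metadata], which is never indexed but remains readable by the output. A sketch (not the asker's config; the rename target is an assumption):

```
filter {
  mutate {
    # move type out of the indexed document into event metadata
    rename => { "type" => "[@metadata][type]" }
  }
}
output {
  elasticsearch {
    # use the stashed value as the Elasticsearch mapping type
    document_type => "%{[@metadata][type]}"
  }
}
```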
Upvotes: 3
Views: 1927
It works for me this way:
input {
  file {
    add_field => { "[@metadata][type]" => "apache-access" }
    path => "/var/log/apache2/access.log"
  }
}
filter {
  ......
  if [@metadata][type] == "xxx" {
  }
  ......
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
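Note that fields under [@metadata] are hidden from outputs by default, which is exactly why they never reach the indexed document. While testing, a stdout output with the rubydebug codec can be told to print them (this is a debugging sketch, not part of the fix):

```
output {
  stdout {
    codec => rubydebug {
      metadata => true    # also print [@metadata] fields, hidden by default
    }
  }
}
```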
See the Logstash documentation on @metadata and the elasticsearch output's document_type option.
Upvotes: 3