Schnebdreleg

Reputation: 87

logstash - filter logs and send to different elasticsearch cluster

let's say I've got a stack like this: logstash-forwarder -> logstash -> elasticsearch -> kibana

I wonder if it is possible to monitor a whole directory with logstash-forwarder and send the logs to different elasticsearch clusters based on filters. Use case:

I've got some programs that write their logs to the same directory. These logs may contain two types of messages, either "private" or debug, and both types can appear in the same logfile. I know it is possible to assign certain files a different type and route them to different outputs with an if condition. What I don't know is what to do when a single logfile can contain more than one type of log message.

Is there a way to split them? I want to restrict access to the log messages with private information to certain users, so I thought of two separate elasticsearch clusters, each with its own Kibana and LDAP.
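For reference, logstash-forwarder can already watch a whole directory via a glob in its JSON config. A minimal sketch (the paths, hostname, port, and `type` value here are placeholders, not taken from my actual setup):

    {
        "network": {
            "servers": [ "logstash-host:5043" ],
            "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
        },
        "files": [
            {
                "paths": [ "/var/log/myapps/*.log" ],
                "fields": { "type": "applog" }
            }
        ]
    }

The open question is only the second part: splitting messages from within the same files.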

BR

Upvotes: 0

Views: 1175

Answers (1)

Frank

Reputation: 1489

Have your filter add a new field based on the message content, then use that field to decide which output the message should go to.

Event flow:

logstash-forwarder --> broker --> logstash-indexer | --> elasticsearch public
                                                   | --> elasticsearch private

Pseudo config:

input { 
    # broker input
}

filter {

    # structure the message
    grok {}

    # tag each event based on its content; add_field belongs
    # inside a mutate filter, and filters cannot be nested
    if [action] == "login" {
        mutate { add_field => { "privacy" => "private" } }
    } else {
        mutate { add_field => { "privacy" => "public" } }
    }
}

output {
    if [privacy] == "private" {
        elasticsearch { 
            # private elasticsearch instance
        }
    }

    if [privacy] == "public" {
        elasticsearch { 
            # public elasticsearch instance
        }
    }

}
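The empty grok {} above is just a placeholder; what goes in it depends entirely on your log format. Assuming a made-up line format like `2015-06-01 10:00:00 login user=bob`, it could be filled in along these lines:

    filter {
        # parse the raw line into fields; the pattern below matches
        # the hypothetical format above, not your actual logs
        grok {
            match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{WORD:action} %{GREEDYDATA:details}" }
        }
    }

Once grok has extracted an `[action]` field (or whatever field distinguishes private from public content), the conditional in the filter block can tag the event and the output block routes it accordingly.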

Upvotes: 3

Related Questions