pfeigl

Reputation: 487

Pull logs from remote server into elasticsearch

The short question is: is it possible to pull logs (i.e. the contents of logfiles) from a remote server and ingest them into the ELK stack?

The long story is the following:

The logs are stored on a server in a DMZ. Due to this setup, we cannot follow the normal route of installing Filebeat on the server where the logs are stored and pushing the messages towards our Logstash installation.

What we would like to do instead is pull the log data from that server into our intranet and ingest it there.

Our investigations so far have only turned up solutions that push the log information to either Logstash or Elasticsearch.

One thing we do not want to do is to use fileshares to make the logfiles available directly from the intranet.

Our question is whether what we have in mind is possible at all and, if so, with which tools and setup we could accomplish it.

Upvotes: 2

Views: 2155

Answers (1)

leandrojmp

Reputation: 7473

You can try the following approach, using Kafka as a message broker:

On your DMZ server you will have Filebeat collecting logs and sending them to a Logstash instance; this Logstash instance will then output your logs to Kafka.

It is a simple pipeline with a beats input, your filters, and a kafka output. If you don't want to do any enrichment on your data, you can send your logs directly to Kafka from Filebeat.
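A minimal sketch of that DMZ-side pipeline (the broker address `kafka-broker:9092`, the topic name `dmz-logs`, and the beats port are assumptions, adjust them to your environment):

```
# Logstash pipeline on the DMZ server: beats input -> filters -> kafka output
input {
  beats {
    port => 5044                              # Filebeat ships its events here
  }
}

filter {
  # optional parsing/enrichment goes here
}

output {
  kafka {
    bootstrap_servers => "kafka-broker:9092"  # assumed broker address
    topic_id => "dmz-logs"                    # assumed topic name
    codec => json
  }
}
```

If you skip the DMZ-side Logstash, the equivalent Filebeat configuration would look roughly like this:

```yaml
# filebeat.yml on the DMZ server, producing straight to Kafka
output.kafka:
  hosts: ["kafka-broker:9092"]
  topic: "dmz-logs"
```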

Your Kafka broker will then listen on a port, waiting for consumers to connect and consume the messages.
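Assuming a broker reachable as `kafka-broker:9092` and a topic named `dmz-logs` (both hypothetical names), you could create the topic and verify that the DMZ side is producing with the stock Kafka CLI tools:

```
# create the topic (adjust partitions/replication to your cluster)
kafka-topics.sh --create --topic dmz-logs \
  --bootstrap-server kafka-broker:9092 \
  --partitions 3 --replication-factor 1

# read a few messages to check that events are arriving
kafka-console-consumer.sh --topic dmz-logs \
  --bootstrap-server kafka-broker:9092 \
  --from-beginning --max-messages 5
```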

On your intranet you will need a Logstash instance with a pipeline using the kafka input; this pipeline acts as a Kafka consumer and will pull your messages. You can then use the elasticsearch output to store them in your intranet Elasticsearch cluster.
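The intranet-side pipeline could then look like this sketch (again, broker address, topic, consumer group, Elasticsearch host, and index name are all assumptions):

```
# Logstash pipeline on the intranet: kafka input -> elasticsearch output
input {
  kafka {
    bootstrap_servers => "kafka-broker:9092"  # assumed broker address
    topics => ["dmz-logs"]                    # assumed topic name
    group_id => "intranet-logstash"           # assumed consumer group
    codec => json
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]    # assumed cluster address
    index => "dmz-logs-%{+YYYY.MM.dd}"        # assumed index pattern
  }
}
```

Because this Logstash instance initiates the connection to Kafka, it is the intranet that pulls the data, which is exactly the direction you asked for.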

For more information, read the Kafka documentation and the pages for the kafka input and kafka output plugins in the Logstash documentation.

Upvotes: 3

Related Questions