Adam Matan

Reputation: 136141

Logstash creates enormous local log files (although it sends them to Elasticsearch)

The problem

I have a machine with Logstash on it, and another Elasticsearch-Kibana machine which stores the logs shipped from Logstash on the first machine. Naturally, I want no logs kept on the origin machine; logging should be handled only on the Elasticsearch cluster.

Unfortunately, logstash creates huge log files on the first machine (where nothing should be kept):

[Screenshot: huge log files under /var/log/logstash on the origin machine]

The configuration

I have only one file under /etc/logstash on the origin machine, and as far as I can see, the configuration does not specify a local output:

input {
    tcp {
        port => 5959
        codec => json
    }
    udp {
        port => 5959
    }
}
filter {
    json {
        source => "message"
    }
}
filter {
    if [@message] == "Incoming Event" {
        mutate {
            add_field => {
                "location" => "%{@fields[location]}"
            }
        }
    }
}
output {
    elasticsearch {
        # The host on which Elasticsearch and Kibana live
        host => "some.internal.aws.ip"
    }
}

How can I stop Logstash from writing local logs by configuration? I know I could delete them periodically with a cron job, but I think prevention is less error-prone.

Upvotes: 5

Views: 14420

Answers (4)

Felipe Emerim

Reputation: 403

If you are using the Docker image, Logstash ships with a default pipeline named logstash (logstash.conf). That pipeline logs everything to stdout. You must override it to avoid immense logs.
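One way to override it is to mount your own pipeline directory over the image's default one. A minimal sketch, assuming the official image layout (the beats port and the Elasticsearch host name here are placeholders):

# docker run --rm -v "$PWD/pipeline/:/usr/share/logstash/pipeline/" \
#     docker.elastic.co/logstash/logstash:7.17.0
#
# pipeline/logstash.conf -- replacement pipeline with no stdout output:
input {
    beats {
        port => 5044
    }
}
output {
    elasticsearch {
        hosts => ["http://elasticsearch:9200"]
    }
}

Because the mount shadows the bundled logstash.conf, the stdout output never loads.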

Upvotes: 0

Realistic

Reputation: 1068

This output is likely caused by having the following in one of your output config files:

stdout { codec => rubydebug }

After removing that from my 30-output.conf, Logstash stopped being so verbose.
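For reference, the quiet output section keeps only the elasticsearch output (the host value here is a placeholder):

output {
    # stdout { codec => rubydebug }   # <- remove or comment out this line
    elasticsearch {
        host => "some.internal.aws.ip"
    }
}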

This thread led me to the answer.

Upvotes: 7

Peter Lamperud

Reputation: 71

I had the same problem as you, running on a CentOS 7 machine: no output to anything other than Elasticsearch, yet Logstash still wrote all incoming messages to logstash.log and logstash.stdout.

After digging into the actual Ruby code, it turned out that the default logging mode is very verbose.

There is, however, a flag called --quiet (apparently undocumented, as far as I can see) which solves the problem.

Add the flag to the LS_OPTS variable, either in the config file (/etc/sysconfig/logstash on CentOS) or directly in the init.d script, like so:

# Arguments to pass to logstash agent
LS_OPTS="--quiet"

Upvotes: 7

Adam Matan

Reputation: 136141

The logrotate solution

Unfortunately, I did not find any --verbose or --debug flags in /etc/init.d/logstash. Therefore, I tried to figure out why logrotate was not archiving the file.

/etc/logrotate.d/logstash is:

/var/log/logstash/*.log {
        daily
        rotate 7
        copytruncate
        compress
        delaycompress
        missingok
        notifempty
}

But when I tried running it, I got:

$ logrotate --force logrotate.d/logstash --verbose
Ignoring logrotate.d/logstash because of bad file mode.

A quick search came to the rescue:

sudo chmod 0644 logrotate.d/logstash

I have changed the frequency from daily to hourly and everything seems to work fine now.
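For completeness, the stanza after the change looks like this (note that hourly only takes effect if logrotate itself is invoked at least hourly, e.g. from cron; many distributions run it only daily by default):

/var/log/logstash/*.log {
        hourly
        rotate 7
        copytruncate
        compress
        delaycompress
        missingok
        notifempty
}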

This should probably not be the accepted answer. If anyone has a better solution which can prevent logstash from writing these redundant logs in the first place, I would love to accept it.

Upvotes: 1
