Karthikeyan Kesavaraj

Reputation: 91

logstash-forwarder missing log events for frequently rolling log files

My Java application has a lot of batch jobs and real-time data feeds. Basically, it is an enterprise integration application.

My application creates a log file for each batch job and uses log4j2 as the logging framework. log4j2 is customized to create log files dynamically, based on the feed name that I provide through a bash script invocation, and uses the log4j2 routing appender for log file creation.

Here is the log4j2 configuration.

<?xml version="1.0" encoding="UTF-8"?>
<Configuration xmlns="http://logging.apache.org/log4j/2.0/config" shutdownHook="disable">
    <Properties>
        <Property name="log-path">${env:LOGPATH}</Property>
    </Properties>
    <Appenders>
        <Routing name="RoutingAppender" ignoreExceptions="true">
            <Routes pattern="${ctx:logFileName}">
                <Route>
                    <RollingFile name="${ctx:logFileName}"
                        fileName="${log-path}/${ctx:logFileName}.log"
                        filePattern="${log-path}/${ctx:logFileName}_%i.log.gz">
                        <BurstFilter level="DEBUG" rate="16" maxBurst="100"/>
                        <PatternLayout pattern="%d{MM-dd-yyyy HH:mm:ss,SSS}|%level|[%thread]|[${env:USERNAME}]|[%C]|- %msg%n"/>
                        <SizeBasedTriggeringPolicy size="50MB"/>
                        <DefaultRolloverStrategy max="20"/>
                    </RollingFile>
                </Route>
            </Routes>
        </Routing>
    </Appenders>
    <Loggers>
        <Root level="INFO">
            <AppenderRef ref="RoutingAppender"/>
        </Root>
    </Loggers>
</Configuration>
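
For context, the ${ctx:logFileName} lookup is driven from the application by putting the feed name into the log4j2 ThreadContext before logging. A minimal sketch of that wiring (the FeedJob class and the feed-name argument are illustrative, not my actual code):

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.ThreadContext;

public class FeedJob {
    private static final Logger LOGGER = LogManager.getLogger(FeedJob.class);

    public static void main(String[] args) {
        // The feed name passed in by the bash script becomes the log file name;
        // the key "logFileName" must match the ${ctx:logFileName} lookup
        // in the Routes pattern above.
        String feedName = args.length > 0 ? args[0] : "sample-feed";
        ThreadContext.put("logFileName", feedName);
        try {
            LOGGER.info("Processing feed {}", feedName);
        } finally {
            // Clear the context so pooled/reused threads do not route
            // log events to a stale feed's file.
            ThreadContext.remove("logFileName");
        }
    }
}
```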

This configuration is working perfectly and creates log files based on the log file name provided.

There are a few batch jobs that run longer and process millions of records. The business requirement is to capture the data that flows through my application and log it for audit purposes.

For most of the feeds, I do not see any log events lost. But for some feeds, where the size-based rollover happens every 3 minutes because of the amount of data processed, log events are lost.

I may not be able to increase the SizeBasedTriggeringPolicy size, because at any given time there may be more than 1500 feeds processing, and increasing it may lead to disk space issues.
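
As a rough sanity check on the disk-space concern (feed count from this question, file size and rollover count from the log4j2 config above; gzip compression of rolled files would shrink the real footprint):

```java
public class DiskBudget {
    public static void main(String[] args) {
        long feeds = 1500;        // concurrent feeds, worst case
        long rolledPerFeed = 20;  // DefaultRolloverStrategy max="20"
        long fileSizeMb = 50;     // SizeBasedTriggeringPolicy size="50MB"
        long totalMb = feeds * rolledPerFeed * fileSizeMb;
        // 1500 * 20 * 50 = 1,500,000 MB, i.e. roughly 1.5 TB uncompressed
        System.out.println(totalMb + " MB");
    }
}
```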

Here is my logstash forwarder configuration

{
  "network": {
    "servers": [ "Server1:5004","Server2:5004","Server3:5004"],
    "ssl certificate": "./certs/logstash.crt",
    "ssl key": "./certs/logstash.key",
    "ssl ca": "./certs/logstash.crt",
    "timeout": 15
  },

  "files": [
    {
      "paths": [
        "/opt/logs/*.log"],
      "fields": { "type": "application" },
      "dead time": "10m"
    }
  ]
}

Here are the args set for logstash forwarder

./logstash-forwarder -quiet=true -config=./logstash-forwarder.conf

Kibana shows that log events arrive only every 2 minutes, even though my application processes data continuously and I can see the events written to my log file. From my analysis, the offset of the previously rolled file is being carried over and used for the current log file.

What configuration should be added to the logstash forwarder in order to capture all the log events?

I have tried -tail=true, which did not work.

I am using logstash-forwarder version 0.4.0, and my operating system is RHEL 5.x.

Upvotes: 1

Views: 618

Answers (1)

Karthikeyan Kesavaraj

Reputation: 91

At the time this question was asked, Filebeat was in development. The Elastic team suggested moving to Filebeat, as logstash-forwarder was going to be sunset.
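
For readers landing here: a minimal Filebeat (1.x-era) prospector covering the same path might look like the sketch below. This is an assumption based on the Filebeat 1.x config format, not something tested against this setup; the hosts and cert paths simply mirror the logstash-forwarder config above, and the port may need adjusting (Filebeat's Logstash output talks to the beats input, commonly on port 5044, not the lumberjack input on 5004):

```yaml
filebeat:
  prospectors:
    - paths:
        - /opt/logs/*.log
      fields:
        type: application
output:
  logstash:
    hosts: ["Server1:5044", "Server2:5044", "Server3:5044"]
    tls:
      certificate: ./certs/logstash.crt
      certificate_key: ./certs/logstash.key
      certificate_authorities: ["./certs/logstash.crt"]
```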

Upvotes: 0
