Reputation: 1
I have to send application logs in bulk to an Elasticsearch server using a custom Log4j2 appender. I have the following log4j2.xml and appender class. Could you help me achieve this?
@Plugin(name = "ElasticAppender", category = Core.CATEGORY_NAME, elementType = Appender.ELEMENT_TYPE)
public class ElasticAppender extends AbstractAppender {

    private final ConcurrentLinkedQueue<Object> logQueue = new ConcurrentLinkedQueue<>();

    protected ElasticAppender(String name, Filter filter, Layout<? extends Serializable> layout,
                              boolean ignoreExceptions, Property[] properties) {
        super(name, filter, layout, ignoreExceptions, properties);
    }

    @PluginFactory
    public static ElasticAppender createAppender(
            @PluginAttribute("name") String name,
            @PluginElement("Filter") Filter filter,
            @PluginElement("Layout") Layout<? extends Serializable> layout,
            @PluginAttribute("ignoreExceptions") boolean ignoreExceptions,
            @PluginElement("properties") Property[] properties) {
        return new ElasticAppender(name, filter, layout, ignoreExceptions, properties);
    }

    @Override
    public void append(LogEvent event) {
        logQueue.add(getLayout().toSerializable(event));
    }
}
Log4j2.xml file:
<Appenders>
    <Console name="Console-2">
        <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %L %-5level %logger{36}- %msg%n"/>
        <JsonTemplateLayout eventTemplateUri="classpath:json-logger.json"/>
    </Console>
    <ElasticAppender name="ElasticAppender">
        <!-- <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %L %-5level %logger{36}- %msg%n"/> -->
        <JsonTemplateLayout eventTemplateUri="classpath:json-logger.json"/>
    </ElasticAppender>
</Appenders>
<Loggers>
    <Root level="debug">
        <AppenderRef ref="Console-2"/>
    </Root>
    <Logger name="sensitive-logger" level="INFO" additivity="false">
        <AppenderRef ref="ElasticAppender"/>
    </Logger>
</Loggers>
Upvotes: 0
Views: 442
Reputation: 9151
Well, an ElasticAppender could certainly be made to work, but I would not recommend buffering through your own queue as you have done. Log4j provides Asynchronous Loggers and the AsyncAppender to deal with that for you.
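For example, a minimal log4j2.xml sketch that keeps your ElasticAppender but lets the AsyncAppender do the queuing on a background thread; the name AsyncElastic and the bufferSize are illustrative:

<Appenders>
    <ElasticAppender name="ElasticAppender">
        <JsonTemplateLayout eventTemplateUri="classpath:json-logger.json"/>
    </ElasticAppender>
    <!-- The Async appender buffers events and forwards them to the wrapped appender off the caller's thread -->
    <Async name="AsyncElastic" bufferSize="1024">
        <AppenderRef ref="ElasticAppender"/>
    </Async>
</Appenders>
<Loggers>
    <Logger name="sensitive-logger" level="INFO" additivity="false">
        <AppenderRef ref="AsyncElastic"/>
    </Logger>
</Loggers>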
In our environment we previously used a synchronous SocketAppender to send the logs to Logstash. However, one day Elasticsearch locked up due to an error, which caused Logstash to become unresponsive, which in turn caused the app to block while logging. As a result, we switched to logging JSON to a rotating file and used Filebeat to ship the logs. This has worked fairly well.
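That file-plus-Filebeat setup needs nothing custom. A sketch, where the file names, rollover policy, and the bundled EcsLayout.json event template are illustrative choices:

<RollingFile name="JsonFile"
             fileName="logs/app.json"
             filePattern="logs/app-%d{yyyy-MM-dd}-%i.json">
    <JsonTemplateLayout eventTemplateUri="classpath:EcsLayout.json"/>
    <Policies>
        <SizeBasedTriggeringPolicy size="50 MB"/>
    </Policies>
    <DefaultRolloverStrategy max="10"/>
</RollingFile>

Filebeat then tails logs/app*.json and ships the JSON documents to Elasticsearch (or Logstash) for you.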
In short, you really shouldn't need to write a new Appender to get your logs to ElasticSearch.
That said, if you do prefer to write to ElasticSearch directly, then just have your appender send the formatted events via the Elastic client API.
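A rough sketch of that approach, building on your class, assuming the low-level Elasticsearch RestClient is on the classpath and the layout produces JSON (as JsonTemplateLayout does); the host, the "app-logs" index name, and the 5-second flush interval are illustrative, not part of your question:

import java.io.IOException;
import java.io.Serializable;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

import org.apache.http.HttpHost;
import org.apache.logging.log4j.core.Appender;
import org.apache.logging.log4j.core.Core;
import org.apache.logging.log4j.core.Filter;
import org.apache.logging.log4j.core.Layout;
import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.appender.AbstractAppender;
import org.apache.logging.log4j.core.appender.AppenderLoggingException;
import org.apache.logging.log4j.core.config.Property;
import org.apache.logging.log4j.core.config.plugins.Plugin;
import org.apache.logging.log4j.core.config.plugins.PluginAttribute;
import org.apache.logging.log4j.core.config.plugins.PluginElement;
import org.apache.logging.log4j.core.config.plugins.PluginFactory;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.RestClient;

@Plugin(name = "ElasticAppender", category = Core.CATEGORY_NAME, elementType = Appender.ELEMENT_TYPE)
public final class ElasticAppender extends AbstractAppender {

    private final ConcurrentLinkedQueue<String> queue = new ConcurrentLinkedQueue<>();
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
    // Illustrative endpoint; read it from a plugin attribute in real code.
    private final RestClient client =
            RestClient.builder(new HttpHost("localhost", 9200, "http")).build();

    protected ElasticAppender(String name, Filter filter, Layout<? extends Serializable> layout,
                              boolean ignoreExceptions, Property[] properties) {
        super(name, filter, layout, ignoreExceptions, properties);
    }

    @PluginFactory
    public static ElasticAppender createAppender(
            @PluginAttribute("name") String name,
            @PluginElement("Filter") Filter filter,
            @PluginElement("Layout") Layout<? extends Serializable> layout,
            @PluginAttribute("ignoreExceptions") boolean ignoreExceptions,
            @PluginElement("properties") Property[] properties) {
        return new ElasticAppender(name, filter, layout, ignoreExceptions, properties);
    }

    @Override
    public void start() {
        super.start();
        // Drain the queue every 5 seconds on a background thread.
        scheduler.scheduleWithFixedDelay(this::flush, 5, 5, TimeUnit.SECONDS);
    }

    @Override
    public void append(LogEvent event) {
        // JsonTemplateLayout serializes the event to a JSON string.
        queue.add((String) getLayout().toSerializable(event));
    }

    private void flush() {
        if (queue.isEmpty()) {
            return;
        }
        // The _bulk API takes NDJSON: an action line, then the document, each newline-terminated.
        StringBuilder body = new StringBuilder();
        String doc;
        while ((doc = queue.poll()) != null) {
            body.append("{\"index\":{\"_index\":\"app-logs\"}}\n").append(doc);
            if (!doc.endsWith("\n")) {
                body.append('\n');
            }
        }
        try {
            Request request = new Request("POST", "/_bulk");
            request.setJsonEntity(body.toString());
            client.performRequest(request);
        } catch (IOException e) {
            if (!ignoreExceptions()) {
                throw new AppenderLoggingException("Bulk indexing failed", e);
            }
        }
    }

    @Override
    public boolean stop(long timeout, TimeUnit timeUnit) {
        scheduler.shutdown();
        flush(); // best-effort final flush
        try {
            client.close();
        } catch (IOException ignored) {
            // nothing useful to do at shutdown
        }
        return super.stop(timeout, timeUnit);
    }
}

The queue-plus-scheduler above is just the simplest way to batch; the official Elasticsearch Java API client with its bulk request support would work equally well, and retries/error handling are left out for brevity.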
Upvotes: 0