Gabriel Petrovay

Reputation: 21884

How to change the Kafka client logging levels/preferences?

I am using a plain Java project (no framework) to run a Kafka producer and a consumer.

I am trying to control the logs generated by the KafkaProducer and KafkaConsumer code, but I cannot influence them using the log4j.properties configuration:

log4j.rootLogger=ERROR,stdout

log4j.logger.kafka=ERROR,stdout
log4j.logger.org.apache.kafka.clients.producer.ProducerConfig=ERROR,stdout
log4j.logger.org.apache.kafka.common.utils.AppInfoParser=ERROR,stdout
log4j.logger.org.apache.kafka.clients.consumer.internals.AbstractCoordinator=ERROR,stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c)%n

Still, I get log output like the one below, whatever settings I provide in the log4j.properties file:

[main] INFO org.apache.kafka.clients.producer.ProducerConfig - ProducerConfig values:
...
[main] INFO org.apache.kafka.clients.producer.ProducerConfig - ProducerConfig values:
...
[main] INFO org.apache.kafka.clients.producer.ProducerConfig - ProducerConfig values:
...
[main] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator - [Consumer clientId=UM00160, groupId=string-group] (Re-)joining group

How can I control the logging of the Kafka clients library? What am I missing to link my log4j.properties file to the Kafka clients library logging? To avoid spamming the output I currently have to run the Maven tests with mvn test 2> /dev/null. Can I configure this via log4j.properties instead?

Context:

I have the following relevant files:

── test
   ├── java
   │   └── com
   │       └── example
   │           ├── PropertyReader.java
   │           └── strings
   │               └── TestKafkaStringValues.java
   └── resources
       ├── application.properties
       └── log4j.properties

And I am trying to run TestKafkaStringValues.java both with the Maven Surefire plugin (mvn test) and with the Eclipse JUnit runner (equivalent to java ...).

For surefire I use the following configuration in the Maven pom.xml:

<plugin>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.22.2</version>
    <configuration>
        <systemPropertyVariables>
            <log4j.configuration>file:log4j.properties</log4j.configuration>
        </systemPropertyVariables>
    </configuration>
</plugin>

and for JUnit I use the following Java VM argument: -Dlog4j.configuration=log4j.properties.

I also tried in both cases to use the absolute path to log4j.properties. Still not working.
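For completeness, here is a minimal sketch (class name is just for illustration) that I can run to see which SLF4J backend is actually picked up at runtime:

import org.slf4j.LoggerFactory;

public class WhichSlf4jBinding {

    public static void main(String[] args) {
        // Prints the ILoggerFactory implementation that SLF4J is bound to
        // (Log4j, Logback, slf4j-simple, ...), i.e. the backend that actually
        // decides whether my log4j.properties is read at all.
        System.out.println(LoggerFactory.getILoggerFactory().getClass().getName());
    }
}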

You can see the complete code here.

Upvotes: 5

Views: 23333

Answers (1)

Gabriel Petrovay

Reputation: 21884

The problem in the code above was a missing Maven runtime dependency: there was no actual Log4j logging implementation on the classpath. In the pom, the slf4j-simple logging implementation was provided instead. This implementation was:

  • able to print the Kafka logs to stdout
  • NOT able to understand the log4j.properties file or the -Dlog4j.* properties.

Hence, one has to include a Log4j implementation. Here one has the choice between Log4j 1.x (end of life) and Log4j 2.

With the following configuration, one should be able to get very comprehensive and granular control over the logging (including the Kafka clients).

In the pom.xml:

<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-api</artifactId>
    <version>2.13.1</version>
</dependency>
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.13.1</version>
</dependency>
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-slf4j-impl</artifactId>
    <version>2.13.1</version>
    <scope>test</scope>
</dependency>

log4j-api and log4j-core are the minimum you would need. In order for Log4j 2 to also control/configure libraries written on top of SLF4J (and the Kafka client is such a library), you need to add the third dependency: log4j-slf4j-impl.

NOTE: for libraries that use SLF4J 1.8.x and higher, you will need another version of this Log4j-SLF4J adapter. See this for more information.
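As a quick sanity check (a minimal sketch, not part of the original project), one can verify that a logger obtained through SLF4J, exactly like the loggers inside the Kafka clients, is now handled by Log4j 2:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class Slf4jBindingSmokeTest {

    private static final Logger log = LoggerFactory.getLogger(Slf4jBindingSmokeTest.class);

    public static void main(String[] args) {
        // This call goes through the SLF4J API, just like the Kafka client logging does.
        // With log4j-slf4j-impl on the classpath it is routed to Log4j 2 and therefore
        // formatted and filtered according to log4j2.properties.
        log.info("SLF4J is now backed by Log4j 2");
    }
}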

Now regarding configuring the logging: Log4j 2 automatically loads its configuration file if it finds one, searching in multiple locations.

If you place the following log4j2.properties file on the resource classpath (in src/main/resources/ for main code and in src/test/resources/ for test code), you will get the desired outcome:

rootLogger.level = info
rootLogger.appenderRefs = stdout
rootLogger.appenderRef.stdout.ref = STDOUT

appenders = stdout

appender.stdout.name = STDOUT
appender.stdout.type = Console
appender.stdout.layout.type = PatternLayout
appender.stdout.layout.pattern = %d{yyyy-MM-dd HH:mm:ss.SSS} [%level] [%t] %c - %m%n

loggers = kafka, kafka-consumer

logger.kafka.name = org.apache.kafka
logger.kafka.level = warn

logger.kafka-consumer.name = org.apache.kafka.clients.consumer
logger.kafka-consumer.level = info

In the above example, all logging is written to stdout and:

  • the root logger logs info and above
  • all org.apache.kafka-prefixed loggers log warn and above
  • all org.apache.kafka.clients.consumer-prefixed loggers log info and above
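For example, with the configuration above, a plain producer like the following hypothetical snippet (broker address and topic name are made up) no longer prints the verbose "ProducerConfig values:" INFO dump, because the org.apache.kafka loggers are capped at warn:

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProducerLoggingDemo {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumption: local broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // The "ProducerConfig values:" dump is logged at INFO by
        // org.apache.kafka.clients.producer.ProducerConfig, so logger.kafka.level = warn suppresses it.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("test-topic", "key", "value"));
        }
    }
}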

Here are some extra observations when using Log4j2:

  • if you want JSON or YAML configuration, you need extra dependencies
  • the JUnit runner in Eclipse will terminate silently, without any output, if the Log4j 2 configuration is not correct; the mvn output will show you the error, though.

Upvotes: 7
