Runtimeterror

Reputation: 1

How do I configure Azure Databricks with log4j2?

log4j:WARN Failed to set property [rollingPolicy] to value "org.apache.log4j.rolling.TimeBasedRollingPolicy". 
log4j:WARN Please set a rolling policy for the DatabricksRollingFileAppender named 'publicFile'
log4j:WARN Failed to set property [rollingPolicy] to value "org.apache.log4j.rolling.TimeBasedRollingPolicy". 
log4j:WARN Please set a rolling policy for the DatabricksRollingFileAppender named 'privateFile'
log4j:WARN Failed to set property [rollingPolicy] to value "org.apache.log4j.rolling.TimeBasedRollingPolicy". 
log4j:WARN Please set a rolling policy for the DatabricksRollingFileAppender named 'product'
log4j:WARN Failed to set property [rollingPolicy] to value "org.apache.log4j.rolling.TimeBasedRollingPolicy". 
log4j:WARN Please set a rolling policy for the DatabricksRollingFileAppender named 'metrics'
log4j:WARN Failed to set property [rollingPolicy] to value "org.apache.log4j.rolling.TimeBasedRollingPolicy". 
log4j:WARN Please set a rolling policy for the DatabricksRollingFileAppender named 'usage'

Upvotes: 0

Views: 2213

Answers (2)

Klugscheißer

Reputation: 1614

As of Databricks Runtime 11.0 with Spark 3.3.0, log4j has been upgraded to log4j2. I haven't seen official documentation for custom log support, but this is what has been working for me. As with log4j 1 support, you need a custom init script that runs on the cluster and creates a log4j2.properties file. This script might look something like:

#! /bin/bash

set -euxo pipefail

echo "Running on the driver? ${DB_IS_DRIVER}"
echo "Driver ip: ${DB_DRIVER_IP}"

# Append the custom appender and logger to the driver's log4j2 configuration
cat >>/databricks/spark/dbconf/log4j/driver/log4j2.properties <<EOL

appender.customFile.type = RollingFile
appender.customFile.name = customFile
appender.customFile.layout.type = PatternLayout
appender.customFile.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n%ex
appender.customFile.filePattern = logs/log4j.custom.%d{yyyy-MM-dd-HH}.log.gz
appender.customFile.policies.type = Policies
appender.customFile.policies.time.type = TimeBasedTriggeringPolicy
appender.customFile.policies.time.interval = 1
appender.customFile.fileName = logs/stdout.custom-active.log

logger.custom.name = com.custom
logger.custom.level = DEBUG
logger.custom.additivity = true
logger.custom.appenderRef.customFile.ref = customFile

EOL
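Before deploying the init script, it can help to sanity-check the properties fragment locally. The sketch below is a minimal, hypothetical check in plain Python (the `parse_props` helper is not part of any Databricks or log4j API): it parses the `key = value` lines and verifies that the logger's appender reference matches the appender's declared name. It assumes log4j2-style logger keys (`logger.custom.level`, `logger.custom.appenderRef...`) rather than the log4j 1 `DEBUG, customFile` shorthand, which log4j2's properties format does not accept.

```python
# Local sanity check for the log4j2.properties fragment appended by the
# init script. This only validates internal consistency of the keys; it
# does not run log4j2 itself.

PROPS = """
appender.customFile.type = RollingFile
appender.customFile.name = customFile
appender.customFile.layout.type = PatternLayout
appender.customFile.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n%ex
appender.customFile.filePattern = logs/log4j.custom.%d{yyyy-MM-dd-HH}.log.gz
appender.customFile.policies.type = Policies
appender.customFile.policies.time.type = TimeBasedTriggeringPolicy
appender.customFile.policies.time.interval = 1
appender.customFile.fileName = logs/stdout.custom-active.log

logger.custom.name = com.custom
logger.custom.level = DEBUG
logger.custom.additivity = true
logger.custom.appenderRef.customFile.ref = customFile
"""

def parse_props(text):
    """Parse simple 'key = value' lines into a dict, skipping blanks and comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

props = parse_props(PROPS)

# The logger's appenderRef must point at the appender's declared name,
# otherwise log4j2 will silently drop the custom logger's output.
assert props["logger.custom.appenderRef.customFile.ref"] == props["appender.customFile.name"]
print("properties fragment looks consistent")
```

Run this locally (or in a notebook cell) before pushing the init script to the cluster; a mismatch between the appender name and the logger's `appenderRef` is the most common silent failure here.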

Upvotes: 1

In Databricks, you can check the complete details on using log4j2 here.

Upvotes: -1
