Reputation: 3107
From this StackOverflow thread, I know how to obtain and use the log4j logger in pyspark like so:
from pyspark import SparkContext
sc = SparkContext()
log4jLogger = sc._jvm.org.apache.log4j
LOGGER = log4jLogger.LogManager.getLogger('MYLOGGER')
LOGGER.info("pyspark script logger initialized")
Which works fine with the spark-submit script.
My question is: how do I configure the log level for this particular logger in the log4j.properties file, or alternatively, how can I configure it dynamically at runtime?
Upvotes: 9
Views: 8990
Reputation: 1427
There are other answers on how to configure log4j via the log4j.properties file, but I haven't seen anyone mention how to do it dynamically, so:
from pyspark import SparkContext
sc = SparkContext()
log4jLogger = sc._jvm.org.apache.log4j
LOGGER = log4jLogger.LogManager.getLogger('MYLOGGER')
# same call as you'd make in Java, just issued through the py4j gateway
LOGGER.setLevel(log4jLogger.Level.WARN)
# INFO is below WARN, so this line will no longer print
LOGGER.info("pyspark script logger initialized")
Upvotes: 11