Instinct

Reputation: 2251

How to set the Spark log4j path in standalone mode on Windows?

I've tried renaming log4j.properties.template to log4j.properties inside hadoop-home/conf, but Spark still does not pick it up. I've tried setting

sparkconf.set("log4j.configuration", ".\\config\\log4j.properties");

but that doesn't work either. I also tried adding

-Dlog4j.configuration=.\config\log4j.properties

to the Eclipse run configuration, but that doesn't work either. Spark is still using its default during startup:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

I also set the SPARK_CONF_DIR environment variable to point to the spark/conf dir, but that doesn't seem to work either.

I am running this in standalone mode on Windows from Eclipse:

SparkConf sparkConf = new SparkConf().setAppName("Test").setMaster("local[1]")
                .set("log4j.configuration", ".\\config\\log4j.properties");
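For reference, a minimal sketch of the system-property route mentioned above (the path and class name here are hypothetical): log4j reads log4j.configuration from JVM system properties, not from SparkConf, so it has to be set before any logging class is loaded.

```java
public class Log4jConfigSketch {
    /**
     * Sets log4j.configuration as a JVM system property and returns the
     * value now visible to the JVM. log4j only honors this property if it
     * is set before the first logger is created, which is why a SparkConf
     * entry (read much later) has no effect.
     */
    public static String apply(String url) {
        System.setProperty("log4j.configuration", url);
        return System.getProperty("log4j.configuration");
    }

    public static void main(String[] args) {
        // Hypothetical location — point this at your own log4j.properties.
        System.out.println(apply("file:///C:/config/log4j.properties"));
    }
}
```

Note that the value should be a URL (file:...) or a classpath resource name, not a bare relative Windows path.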

Upvotes: 2

Views: 4100

Answers (4)

user778806

Reputation: 77

I had this problem from Scala code (there was no problem with spark-shell, for which log4j.properties was working properly). For lack of time, and given that I was only doing small local experiments on my laptop and did not want to change many little didactic programs, I went for a quick and dirty hack: I modified the file log4j-defaults.properties inside the jar kindly indicated by the first diagnostic message:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

Obviously I feel guilty, and I hope my children never find out :-), but it was quick and easy (and I was only working locally on non-production code).

Upvotes: 0

bluenote10

Reputation: 26600

To slightly extend @ThatiK's very helpful answer: If you want to load your logging properties file from a resource instead of a plain file, you can do something like this (source):

import java.util.Properties
import org.apache.log4j.PropertyConfigurator

val props = new Properties()
props.load(getClass.getClassLoader.getResourceAsStream("conf/log4j.properties"))
PropertyConfigurator.configure(props)    

Upvotes: 0

ThatiK

Reputation: 86

I encountered a similar issue when using

sparkconf.set("log4j.configuration", "path to log4j.properties");

The workaround is to configure log4j programmatically:

import org.apache.log4j.PropertyConfigurator

PropertyConfigurator.configure("path to log4j.properties")

Upvotes: 7

vijay kumar

Reputation: 2049

To check whether log4j.properties is being loaded or not, try using spark-shell for confirmation. (spark-shell internally calls spark-submit, and spark-submit is the launch pad of all Spark jobs.)

Before copying (note: the first line says the log4j defaults are in use):

C:\Users\ramisetty\Desktop>spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/08/07 13:01:10 INFO SecurityManager: Changing view acls to: ramisetty
15/08/07 13:01:10 INFO SecurityManager: Changing modify acls to: ramisetty

After copying log4j.properties.template to log4j.properties (meaning log4j-defaults is ignored and log4j.properties is picked up):

C:\Users\ramisetty\Desktop>spark-shell
15/08/07 13:12:37 INFO SecurityManager: Changing view acls to: ramisetty
15/08/07 13:12:37 INFO SecurityManager: Changing modify acls to: ramisetty

To check even more precisely, change log4j.rootCategory=INFO, console to log4j.rootCategory=WARN, console in log4j.properties and observe the change in logging.
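For reference, a minimal sketch of what the edited conf/log4j.properties might look like; the appender keys below follow Spark's shipped log4j.properties.template, with only the rootCategory threshold changed from INFO to WARN:

```properties
# conf/log4j.properties — raise the root threshold from INFO to WARN
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

With WARN in place, the INFO SecurityManager lines from the spark-shell transcripts above should disappear.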

Upvotes: 0
