Mohitt

Reputation: 2977

Spark: spark-submit won't take custom log4j.properties

This could be a possible duplicate of an older post, but that one was about Spark 1.3/1.4. I am using 1.5.2.

I am packaging log4j.properties in my fat jar. It shows different behavior in two scenarios. On Ubuntu machine 1, I have Spark in a user directory. On a similar machine 2, it is installed system-wide. On machine 2, a default log4j.properties is available at /etc/spark/conf/log4j.properties.
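For reference, the bundled file is an ordinary log4j 1.x properties file; a minimal sketch of the kind being packaged (the level and pattern here are illustrative, not my actual file) would be:

# illustrative log4j.properties at the root of the fat jar
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n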

On machine 2, the log4j.properties packaged in the fat jar is not getting loaded, but on machine 1 it is. I am using the same command on both machines:

spark-submit \
--master local[1] \
--class com.myCompany.myMainClass myFat.jar

Based on the Spark documentation, I am, however, able to force it by supplying the file from outside:

spark-submit \
--master local[1] \
--driver-java-options "-Dlog4j.configuration=file:///mnt1/mohit/log4j.properties" \
--class com.myCompany.myMainClass \
myFat.jar
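To see which configuration file log4j actually resolves in each case, log4j 1.x also honors a debug switch that prints the chosen configuration source to stderr; a sketch:

spark-submit \
--master local[1] \
--driver-java-options "-Dlog4j.debug=true" \
--class com.myCompany.myMainClass \
myFat.jar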

Why does the behavior differ? How can I make spark-submit use the packaged file?

Upvotes: 2

Views: 2655

Answers (1)

imriqwe

Reputation: 1455

The documentation you are referring to is intended for YARN, but you are running in local mode.

Try specifying it as follows:

spark-submit \
--master local[1] \
--conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=/mnt1/mohit/log4j.properties" \
--class com.myCompany.myMainClass \
myFat.jar
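If that alone does not take effect, keep in mind that in local mode the executor runs inside the driver JVM, so the driver-side setting may be the one that matters, and log4j 1.x expects a URL rather than a bare path. A variant covering both sides (reusing the path from the question) might be:

spark-submit \
--master local[1] \
--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///mnt1/mohit/log4j.properties" \
--conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:///mnt1/mohit/log4j.properties" \
--class com.myCompany.myMainClass \
myFat.jar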

Upvotes: 2
