Prakul Singhal

Reputation: 41

How to submit a custom log4j.xml file using spark-submit on a multi-node cluster with HDFS and the Spark standalone cluster manager

I am submitting a Java jar to the Spark standalone cluster manager using spark-submit, but I am not able to provide a custom log4j.xml file to it. If I use the --files option, I need to copy that log4j file to the same location on every machine; and if I give it as an hdfs:// path, Spark does not pick it up as the log4j configuration and switches to the default log4j file.

I also tried -Dlog4j.configuration with both local and hdfs:// paths, but the same issue occurs as with --files.

Please help me if someone knows the solution.

Thanks in advance.

Upvotes: 1

Views: 1247

Answers (1)

Spark does not support writing logs to HDFS through log4j; instead, log4j writes the logs locally on your Unix box.

The properties for specifying a log4j configuration in the spark-submit command are:

--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration= Location of your log4j.properties file"

--conf "spark.executor.extraJavaOptions=-Dlog4j.configuration= location of your log4j.properties file"

You have to create a custom log4j.properties file, not a log4j.xml, on your local Unix box.
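
As a starting point, here is a minimal log4j.properties sketch that writes to a rolling local file (the log path is an assumption; adjust it to a directory that is writable on your nodes):

    # Root logger writes INFO and above to a rolling local file
    log4j.rootLogger=INFO, file
    log4j.appender.file=org.apache.log4j.RollingFileAppender
    # Assumed local path; must be writable on each node
    log4j.appender.file.File=/var/log/spark/myapp.log
    log4j.appender.file.MaxFileSize=10MB
    log4j.appender.file.MaxBackupIndex=5
    log4j.appender.file.layout=org.apache.log4j.PatternLayout
    log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %p %c{1}: %m%n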

If you want to write custom logs to HDFS, you can create an interface in Java or a trait in Scala that specifies the logging levels and writes the logs to HDFS; a sketch follows below. For more reference, you can check this question.
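
A minimal sketch of such a helper in Java, using the Hadoop FileSystem API (the class name HdfsLogger and the log line format are hypothetical, not part of Spark or log4j):

    import java.io.IOException;
    import java.net.URI;
    import java.nio.charset.StandardCharsets;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Hypothetical helper, not part of Spark or log4j: writes log lines to a file in HDFS.
    public class HdfsLogger implements AutoCloseable {

        private final FSDataOutputStream out;

        public HdfsLogger(String hdfsPath) throws IOException {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(URI.create(hdfsPath), conf);
            Path path = new Path(hdfsPath);
            // Append if the file already exists (requires append support on the cluster),
            // otherwise create it.
            this.out = fs.exists(path) ? fs.append(path) : fs.create(path);
        }

        public void log(String level, String message) throws IOException {
            String line = System.currentTimeMillis() + " [" + level + "] " + message + "\n";
            out.write(line.getBytes(StandardCharsets.UTF_8));
            out.hflush(); // flush so the line becomes visible to HDFS readers
        }

        @Override
        public void close() throws IOException {
            out.close();
        }
    }

Usage would look something like this, where the namenode address and path are placeholders:

    try (HdfsLogger log = new HdfsLogger("hdfs://namenode:8020/logs/myapp.log")) {
        log.log("INFO", "job started");
    }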

Upvotes: 1
