user2953788

Reputation: 167

SparkContext.setLogLevel("DEBUG") doesn't work in a cluster

I'm trying to control my Spark logs using sc.setLogLevel("ERROR"), but it doesn't seem to have any effect in the cluster environment. Can anyone help?

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public static JavaSparkContext getSparkContext(String appName, SparkConf conf) {
    SparkSession spark = getSparkSession(appName, conf);
    JavaSparkContext sc = new JavaSparkContext(spark.sparkContext());
    // Set the log level for this context
    sc.setLogLevel("WARN");
    return sc;
}

Upvotes: 1

Views: 4147

Answers (1)

Puneet Singh

Reputation: 312

sc.setLogLevel() only changes the log level inside the driver JVM; the executors run in separate JVMs and keep the level from their own log4j configuration, which is why it appears to do nothing in a cluster deployment. To configure log levels cluster-wide, add the following options to your spark-submit command:

--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=custom-log4j.properties"

--conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=custom-log4j.properties"

This assumes a file called custom-log4j.properties is available on the classpath of both the driver and the executors (in cluster mode, the easiest way to get it there is to ship it with --files). That log4j configuration file then controls the verbosity of Spark's logging.
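For reference, a minimal custom-log4j.properties might look like the sketch below. It follows Spark's bundled conf/log4j.properties.template (log4j 1.x syntax, used by Spark before 3.3); the com.example.myapp logger name is just a placeholder for your own packages:

# custom-log4j.properties: a minimal sketch based on Spark's
# conf/log4j.properties.template (log4j 1.x syntax)
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Optionally keep your own packages more verbose while Spark stays quiet
# (com.example.myapp is a hypothetical package name)
log4j.logger.com.example.myapp=INFO

A full spark-submit call could then look roughly like this; the master, deploy mode, main class and jar names are placeholders. --files ships the properties file into the working directory of the driver and executors, so the bare file name in -Dlog4j.configuration resolves:

spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files custom-log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=custom-log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=custom-log4j.properties" \
  --class com.example.MyApp \
  my-app.jar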

Upvotes: 2
