pythonic

Reputation: 21625

How to check the Spark configuration from the command line?

Basically, I want to check a property of Spark's configuration, such as "spark.local.dir", through the command line, that is, without writing a program. Is there a method to do this?

Upvotes: 4

Views: 26873

Answers (5)

Ankit

Reputation: 53

Master command to check the Spark config from the CLI (in the PySpark shell):

sc._conf.getAll()
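
The snippet above is for the PySpark shell; in the Scala spark-shell, the equivalent call would be (a sketch, using the shell's predefined SparkContext sc):

scala> sc.getConf.getAll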

Upvotes: 1

Narender Bhadrecha

Reputation: 97

We can check it in the Spark shell using the command below:

scala> spark.conf.get("spark.sql.shuffle.partitions")
res33: String = 200
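
To look up a single property such as the question's spark.local.dir, a fallback value can be passed as a second argument so the call does not throw when the property was never set explicitly (a sketch; the fallback string here is illustrative):

scala> spark.conf.get("spark.local.dir", "<not set>")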

Upvotes: 4

vaquar khan

Reputation: 11449

The following command prints your configuration properties to the console:

 sc.getConf.toDebugString
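
To check one property instead of dumping everything, the conf can also be queried directly or the debug string filtered (a sketch in the spark-shell; spark.local.dir is the property from the question):

scala> sc.getConf.getOption("spark.local.dir")
scala> sc.getConf.toDebugString.split("\n").filter(_.startsWith("spark.local.dir")).foreach(println)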

Upvotes: 5

Nishu Tayal

Reputation: 20840

There is no option to view the Spark configuration properties from the command line.

Instead, you can check them in the spark-defaults.conf file. Another option is to view them in the web UI.
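
For reference, entries in conf/spark-defaults.conf are plain whitespace-separated key-value pairs; the values below are purely illustrative:

spark.local.dir                /data/spark-scratch
spark.sql.shuffle.partitions   400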

The application web UI at http://driverIP:4040 lists Spark properties in the “Environment” tab. Only values explicitly specified through spark-defaults.conf, SparkConf, or the command line will appear. For all other configuration properties, you can assume the default value is used.

For more details, you can refer to the Spark Configuration documentation.

Upvotes: 5

Przemek

Reputation: 228

Based on http://spark.apache.org/docs/latest/configuration.html, Spark provides three locations to configure the system:

  • Spark properties control most application parameters and can be set using a SparkConf object (see the sketch after this list) or through Java system properties.
  • Environment variables can be used to set per-machine settings, such as the IP address, through the conf/spark-env.sh script on each node.
  • Logging can be configured through log4j.properties.
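
To illustrate the first point, a minimal sketch of setting and reading back a property with a SparkConf object (the app name and value are made up for the example):

import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName("conf-demo")
  .set("spark.local.dir", "/data/spark-scratch")

// getOption returns None if the key was never set, instead of throwing
println(conf.getOption("spark.local.dir"))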

I haven't heard of a method to do this through the command line.

Upvotes: 2
