Reputation: 21625
Basically, I want to check a property of Spark's configuration, such as "spark.local.dir", from the command line, that is, without writing a program. Is there a way to do this?
Upvotes: 4
Views: 26873
Reputation: 97
You can check it in the Spark shell using the following command:
scala> spark.conf.get("spark.sql.shuffle.partitions")
res33: String = 200
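If the key may not be set, spark.conf.get also accepts a fallback default, which avoids a NoSuchElementException on missing keys. A minimal sketch using the question's "spark.local.dir" (the /tmp fallback is just a placeholder, not necessarily your configured value):
scala> spark.conf.get("spark.local.dir", "/tmp")  // configured value if set, otherwise "/tmp"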
Upvotes: 4
Reputation: 11449
The following command prints your configuration properties to the console:
sc.getConf.toDebugString
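If you want one property rather than the whole dump, the same SparkConf also offers getOption and getAll; a small sketch using the question's "spark.local.dir":
scala> sc.getConf.getOption("spark.local.dir")  // Some(value) if explicitly set, None otherwise
scala> sc.getConf.getAll.foreach(println)       // every explicitly set (key, value) pair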
Upvotes: 5
Reputation: 20840
There is no option to view the Spark configuration properties from the command line.
Instead, you can check them in the spark-defaults.conf file. Another option is to view them in the web UI.
The application web UI at http://driverIP:4040 lists Spark properties in the “Environment” tab. Only values explicitly specified through spark-defaults.conf, SparkConf, or the command line will appear. For all other configuration properties, you can assume the default value is used.
For more details, you can refer to the Spark Configuration documentation.
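Since spark-defaults.conf is a plain whitespace-separated properties file, you can also inspect it directly from the shell; a rough sketch, assuming SPARK_HOME is set in your environment and the file exists (both are assumptions about your setup):
scala> val defaults = scala.io.Source.fromFile(sys.env("SPARK_HOME") + "/conf/spark-defaults.conf")
scala> defaults.getLines().filter(_.startsWith("spark.local.dir")).foreach(println)  // print matching entries
scala> defaults.close()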
Upvotes: 5
Reputation: 228
Based on http://spark.apache.org/docs/latest/configuration.html, Spark provides three locations to configure the system:
Spark properties control most application parameters and can be set by using a SparkConf object, or through Java system properties.
Environment variables can be used to set per-machine settings, such as the IP address, through the conf/spark-env.sh script on each node.
Logging can be configured through log4j.properties.
I haven't heard of a way to do this through the command line; a sketch of the first option follows below.
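For the first of those locations, Spark properties, here is a minimal sketch of setting a value through a SparkConf and reading it back in the Spark shell (the /tmp/spark-scratch path is purely illustrative):
scala> import org.apache.spark.SparkConf
scala> val conf = new SparkConf().set("spark.local.dir", "/tmp/spark-scratch")
scala> conf.get("spark.local.dir")  // String = /tmp/spark-scratch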
Upvotes: 2