Raj

Reputation: 2398

How to get default property values in Spark

I am using this version of Spark: spark-1.4.0-bin-hadoop2.6. I want to check a few default properties, so I ran the following statement in spark-shell:

scala> sqlContext.getConf("spark.sql.hive.metastore.version")

I was expecting the call to getConf to return a value of 0.13.1, as described in this link. Instead I got the exception below:

java.util.NoSuchElementException: spark.sql.hive.metastore.version
    at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)
    at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)

Am I retrieving the properties in the right way?

Upvotes: 6

Views: 12365

Answers (2)

praveenak

Reputation: 400

In Spark 2.x.x, if I want to know the value of a Spark conf property, I do this:

The command below returns a Scala Map in spark-shell:

spark.sqlContext.getAllConfs 

To look up a single conf property, e.g. the warehouse dir that Spark has set for spark.sql.warehouse.dir:

spark.sqlContext.getAllConfs.get("spark.sql.warehouse.dir")
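Note that getAllConfs returns an immutable Scala Map[String, String], so the .get call above yields an Option[String]. A minimal sketch of unwrapping it with a fallback (the property name comes from the answer; the fallback string is just illustrative):

```scala
// In spark-shell, where `spark` is the provided SparkSession.
// getAllConfs returns Map[String, String]; .get gives an Option[String].
val warehouseDir: String = spark.sqlContext.getAllConfs
  .get("spark.sql.warehouse.dir")
  .getOrElse("<not set>") // fallback when the key is absent from the conf
println(warehouseDir)
```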

Upvotes: 5

Justin Pihony

Reputation: 67115

You can use

sc.getConf.toDebugString

OR

sqlContext.getAllConfs

which will return all values that have been set; however, some defaults live only in the code. In your specific example, the metastore version is read like this:

getConf(HIVE_METASTORE_VERSION, hiveExecutionVersion)

where the fallback is defined in the code as:

val hiveExecutionVersion: String = "0.13.1"

So getConf attempts to pull the metastore version from the config and falls back to that default, but the default is never listed in the conf itself, which is why your lookup threw NoSuchElementException.
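If you just want the lookup to succeed without the NoSuchElementException, you can do the same thing yourself with the two-argument getConf overload, which takes an explicit fallback (here the "0.13.1" value of hiveExecutionVersion is copied from the Spark source quoted above):

```scala
// Returns the configured value, or the supplied fallback if the key is unset.
val version: String =
  sqlContext.getConf("spark.sql.hive.metastore.version", "0.13.1")
```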

Upvotes: 7
