Oren

Reputation: 1906

Zeppelin and Spark Configuration

I'm working with Zeppelin (0.7.1) on Spark (2.1.1) on my localhost, and trying to add some configuration values to the jobs I run.

Specifically, I'm trying to set the es.nodes value for elasticsearch-hadoop.

I tried adding the key and value to the interpreter configuration, but the setting didn't show up in sc.getConf. Adding the value "--conf mykey:myvalue" to the interpreter's "args" configuration key didn't register either. Isn't this what the Spark interpreter configuration is supposed to do?
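For reference, here is a quick way to check from a notebook paragraph which properties actually reached the runtime (a minimal Scala sketch; "es.nodes" is the key discussed above):

    // Run in a %spark paragraph after restarting the interpreter.
    // Prints None if the plain "es.nodes" key was dropped on the way in.
    println(sc.getConf.getOption("es.nodes"))

    // Dump everything that did make it into the SparkConf:
    sc.getConf.getAll.sorted.foreach { case (k, v) => println(s"$k=$v") }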

Upvotes: 1

Views: 3174

Answers (1)

Oren

Reputation: 1906

Apparently this is an intentional change, introduced in Zeppelin not long ago: only spark.* properties are delegated to the SparkConf, and everything else is filtered out. I have submitted a comment asking for this to be changed, as I believe it is problematic: https://github.com/apache/zeppelin/pull/1970
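If I read the behavior correctly, one workaround is to save the property under a spark.-prefixed name in the interpreter settings, since elasticsearch-hadoop documents that it also resolves its own settings from spark.es.* keys. A sketch, assuming the interpreter property was saved as spark.es.nodes:

    // The spark.* key passes Zeppelin's filter, and elasticsearch-hadoop
    // strips the spark. prefix from spark.es.* keys on its side.
    println(sc.getConf.getOption("spark.es.nodes")) // should now be set

    // Alternatively, pass the setting per read instead of globally
    // ("myindex/mytype" is a placeholder resource name):
    import org.elasticsearch.spark._
    val rdd = sc.esRDD("myindex/mytype", Map("es.nodes" -> "localhost:9200"))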

Upvotes: 2
