Ego

Reputation: 585

Option for specifying Spark environment API when using Spark Shell

Is there an option you can pass to the spark-shell that specifies what environment you will be running your code against? In other words, if I am using Spark 1.3; can I specify that I wish to use the Spark 1.2 API ?

For example:

pyspark --api 1.2

Upvotes: 1

Views: 79

Answers (1)

Ashrith

Reputation: 6855

spark-shell initializes org.apache.spark.repl.Main to start the REPL, and it does not expose any option for selecting an API version. So no, it is not possible to pass an API version on the command line; you have to use the spark-shell (or pyspark) binary that ships with the respective version of Spark.
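
For example (a minimal sketch, assuming each Spark release is unpacked under /opt; the exact paths are hypothetical and depend on your installation):

/opt/spark-1.2.2/bin/pyspark      # Python shell against the Spark 1.2 API
/opt/spark-1.3.1/bin/spark-shell  # Scala REPL against the Spark 1.3 API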

Upvotes: 2
