Reputation: 192
I'm getting the following exception when I'm trying to submit a Spark application to a Mesos cluster:
17/01/31 17:04:21 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/31 17:04:22 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'mesos://localhost:5050'
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2550)
    at org.apache.spark.SparkContext.&lt;init&gt;(SparkContext.scala:501)
Upvotes: 4
Views: 1184
Reputation: 20826
You probably used the wrong command to build Spark, i.e. one missing the -Pmesos profile. Since Spark 2.1.0, Mesos support lives in a separate module that is only included when that profile is enabled; without it, Spark cannot resolve mesos:// master URLs and fails with exactly this "Could not parse Master URL" error. Rebuild with ./build/mvn -Pmesos -DskipTests clean package and resubmit your application.
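If you set the master in code rather than on the spark-submit command line, a minimal Scala sketch of what should work once Spark is rebuilt with the Mesos profile looks like this (the object name and app name are placeholders; mesos://localhost:5050 is the master URL from your error):

import org.apache.spark.{SparkConf, SparkContext}

object MesosSmokeTest {
  def main(args: Array[String]): Unit = {
    // This master URL only resolves once Spark was built with -Pmesos;
    // otherwise SparkContext throws "Could not parse Master URL".
    val conf = new SparkConf()
      .setAppName("MesosSmokeTest")          // placeholder app name
      .setMaster("mesos://localhost:5050")   // Mesos master from the question

    val sc = new SparkContext(conf)
    try {
      // Trivial job just to confirm the Mesos scheduler was created.
      println(sc.parallelize(1 to 100).sum())
    } finally {
      sc.stop()
    }
  }
}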
Upvotes: 4