Reputation: 1259
I am getting this error when I want to run the SparkPi example.
beyhan@beyhan:~/spark-1.2.0-bin-hadoop2.4$ /home/beyhan/spark-1.2.0-bin-hadoop2.4/bin/spark-submit --master ego-client --class org.apache.spark.examples.SparkPi /home/beyhan/spark-1.2.0-bin-hadoop2.4/lib/spark-examples-1.jar
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Error: Master must start with yarn, spark, mesos, or local
Run with --help for usage help or --verbose for debug output
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Also, I already started my master in another terminal:
>./sbin/start-master.sh
starting org.apache.spark.deploy.master.Master, logging to /home/beyhan/spark-1.2.0-bin-hadoop2.4/sbin/../logs/spark-beyhan-org.apache.spark.deploy.master.Master-1-beyhan.out
Any suggestions? Thanks.
Upvotes: 3
Views: 9830
Reputation: 8487
Download and extract Spark:
$ cd ~/Downloads
$ wget -c http://archive.apache.org/dist/spark/spark-1.2.0/spark-1.2.0-bin-hadoop2.4.tgz
$ cd /tmp
$ tar zxf ~/Downloads/spark-1.2.0-bin-hadoop2.4.tgz
$ cd spark-1.2.0-bin-hadoop2.4/
Start master:
$ sbin/start-master.sh
Find the master's URL in the log file that the above command printed. Let's assume the master URL is: spark://ego-server:7077
You can also find your master URL by visiting the web UI at http://localhost:8080/
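If you prefer the command line, the master URL can be pulled out of the Master's log with a small script. A minimal sketch in Python, assuming the standard "Starting Spark master at spark://..." log message (the sample line below is illustrative, not taken from your logs):

```python
import re

# Illustrative log line; the real logs live under logs/spark-*-Master-*.out
log_line = "15/01/04 12:00:00 INFO Master: Starting Spark master at spark://ego-server:7077"

# Extract the first spark:// URL from the line
match = re.search(r"spark://\S+", log_line)
if match:
    print(match.group(0))  # spark://ego-server:7077
```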
Start one slave, and connect it to master:
$ sbin/start-slave.sh 1 spark://ego-server:7077
Another way to ensure that the master is up and running is to start a shell bound to it:
$ bin/spark-shell --master "spark://ego-server:7077"
If you get a Spark shell prompt, then everything is working.
Now execute your job:
$ find . -name "spark-example*jar"
./lib/spark-examples-1.2.0-hadoop2.4.0.jar
$ bin/spark-submit --master "spark://ego-server:7077" --class org.apache.spark.examples.SparkPi ./lib/spark-examples-1.2.0-hadoop2.4.0.jar
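For context, SparkPi estimates Pi by Monte Carlo sampling: it throws random points into the unit square and counts how many land inside the unit circle. A plain-Python sketch of the same idea (no Spark involved, just the math the example parallelizes):

```python
import random

def estimate_pi(n, seed=42):
    """Monte Carlo estimate of Pi: the fraction of random points in the
    unit square that fall inside the unit circle approaches Pi/4."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n

print(estimate_pi(100_000))  # close to 3.14
```

SparkPi does exactly this, but splits the sampling across the cluster's workers and sums the counts.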
Upvotes: 4
Reputation: 1113
The error you're getting,
Error: Master must start with yarn, spark, mesos, or local
means that --master ego-client is not recognized by Spark.
Use
--master local
for local execution of Spark, or
--master spark://your-spark-master-ip:7077
to connect to a standalone master.
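The check spark-submit is applying is essentially a prefix test on the --master value. A rough sketch of that validation (the function name is mine, not Spark's):

```python
def is_valid_master(master: str) -> bool:
    """Rough sketch of spark-submit's check: the master URL must begin
    with one of the recognized scheme prefixes."""
    return master.startswith(("yarn", "spark", "mesos", "local"))

print(is_valid_master("ego-client"))         # False -> triggers the error above
print(is_valid_master("spark://host:7077"))  # True
print(is_valid_master("local[4]"))           # True
```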
Upvotes: 3