Spider

Reputation: 1470

Failed to start master for Spark in Windows

Same problem as "Failed to start master for spark in windows 10", which is also unsolved.

Spark itself works fine: I have tested both pyspark.cmd and spark-shell.cmd.

After running .\sbin\start-master.sh I got:

ps: unknown option -- o
Try 'ps --help' for more information.
starting org.apache.spark.deploy.master.Master, logging to C:\spark-1.6.1-bin-hadoop2.6/logs/spark--org.apache.spark.deploy.master.Master-1-%MY_USER_NAME%-PC.out
ps: unknown option -- o
Try 'ps --help' for more information.
failed to launch org.apache.spark.deploy.master.Master:
  ========================================
  Picked up _JAVA_OPTIONS: -Xmx512M -Xms512M
full log in C:\spark-1.6.1-bin-hadoop2.6/logs/spark--org.apache.spark.deploy.master.Master-1-%MY_USER_NAME%-PC.out

I tried to visit the web UI: localhost:4040 works, but localhost:8080 cannot be reached.

I also found that .log files are created in the %SPARK_HOME%/logs folder. They all contain the same content:

Spark Command:

C:\Program Files\Java\jdk1.7.0_79\bin\java -cp C:\spark-1.6.1-bin-hadoop2.6/conf\;C:\spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar;C:\spark-1.6.1-bin-hadoop2.6\lib\datanucleus-api-jdo-3.2.6.jar;C:\spark-1.6.1-bin-hadoop2.6\lib\datanucleus-core-3.2.10.jar;C:\spark-1.6.1-bin-hadoop2.6\lib\datanucleus-rdbms-3.2.9.jar -Xms1g -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip hahaha-PC --port 7077 --webui-port 8080

========================================
Picked up _JAVA_OPTIONS: -Xmx512M -Xms512M

Working environment: Spark 1.6.1 on Windows 10

Looking forward to your reply, and thanks so much for your time!

Upvotes: 22

Views: 27343

Answers (5)

Kaimenyi

Reputation: 121

If you are looking to start the master and worker processes, this should work for you; it works for me.

  1. To start the master, open a Windows command prompt in the %SPARK_HOME%\bin directory, then copy and paste this command and hit enter:
spark-class org.apache.spark.deploy.master.Master

Point your browser to http://localhost:8080/. If you get a "server not found" error, refresh the page. On this page you will find your master's unique URL, which looks like this: spark://192.xxx.xx.xxx:7077

  2. Open a new terminal, go to %SPARK_HOME%\bin, copy and paste this line of code and hit enter:
spark-class org.apache.spark.deploy.worker.Worker spark://ip:port

Here spark://ip:port is the URL obtained in step 1. Refresh the browser tab opened in step 1 to see that the worker has started (a sketch with optional worker flags follows the note below).

NOTE: JDK 1.9 is not supported
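
If you want to limit what the worker offers the cluster, the Worker class accepts resource options on the same command line. A minimal sketch, assuming the standalone Worker's documented --cores and --memory flags; the spark://ip:port part is again the master URL from step 1:

spark-class org.apache.spark.deploy.worker.Worker --cores 2 --memory 2G spark://ip:port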

Upvotes: 10

Abhishek Chaurasia

Reputation: 876

The launch scripts located at %SPARK_HOME%\sbin do not support Windows. You need to manually run the master and worker as outlined below.

  1. Go to the %SPARK_HOME%\bin folder in a command prompt

  2. Run spark-class org.apache.spark.deploy.master.Master to run the master. This will give you a URL of the form spark://ip:port

  3. Run spark-class org.apache.spark.deploy.worker.Worker spark://ip:port to run the worker. Make sure you use the URL you obtained in step 2.

  4. Run spark-shell --master spark://ip:port to connect an application to the newly created cluster (a batch sketch automating these steps follows below).
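
For convenience, steps 2 through 4 can be wrapped in a small batch file. This is only a sketch, not anything shipped with Spark: it assumes %SPARK_HOME% is set, the default port 7077 is free, and it pins the master to localhost via the --host flag so the worker URL is predictable.

@echo off
rem Hypothetical helper: starts a standalone master and one worker in
rem their own windows, then opens spark-shell against the new cluster.
cd /d %SPARK_HOME%\bin
rem Pin the master to localhost so the worker URL below is predictable.
start "Spark Master" cmd /k spark-class org.apache.spark.deploy.master.Master --host localhost --port 7077
rem Give the master a few seconds to bind before the worker registers.
timeout /t 10
start "Spark Worker" cmd /k spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077
timeout /t 10
spark-shell --master spark://localhost:7077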

Upvotes: 76

yImI

Reputation: 181

A little trick should help: I changed the JAVA_HOME path to the DOS 8.3 version, for instance c:\Progra~1\Java\jre1.8.0_131, then rebooted. After this I was able to run the spark-class org.apache... command mentioned above.
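
In case the short name differs on your machine (Progra~1 is common but not guaranteed), cmd itself can print the 8.3 form of any path; %~sI is the standard short-name modifier of the for variable:

rem Print the 8.3 short path of your Java install (run at a command
rem prompt; inside a .bat file, double the percent signs: %%I, %%~sI).
for %I in ("C:\Program Files\Java\jre1.8.0_131") do @echo %~sI
rem Then point JAVA_HOME at the short path for the current session:
set JAVA_HOME=c:\Progra~1\Java\jre1.8.0_131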

Upvotes: 0

Ashish Garg

Reputation: 11

After executing spark-class org.apache.spark.deploy.master.Master, just go to http://localhost:8080 to get the ip:port. Then open another command shell and execute spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT

Upvotes: 1

Spider

Reputation: 1470

Just found the answer here: https://spark.apache.org/docs/1.2.0/spark-standalone.html

"Note: The launch scripts do not currently support Windows. To run a Spark cluster on Windows, start the master and workers by hand."

Upvotes: 7
