Reputation: 42050
I am completely new to Spark and am trying to run a tutorial example that counts the number of lines containing 'a' and 'b' in a text file in the local file system.
I am running it with SparkContext with master = "local", i.e. Spark is running in the same JVM. Now I would like to try it in "cluster mode": I would like to run a Spark cluster consisting of a cluster manager and two worker nodes, locally on my Mac laptop. What is the easiest way to do that?
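To make it concrete, here is roughly what I run now versus what I think cluster mode would look like (the spark:// URL is just a placeholder):

# What I do now: local mode, Spark runs inside a single JVM
./bin/spark-shell --master local

# What I would like: connect to a small standalone cluster on my laptop
./bin/spark-shell --master spark://localhost:7077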
Upvotes: 0
Views: 1992
Reputation: 5480
If you are looking to learn the various ways to use Spark, I would suggest you download the Cloudera QuickStart VM, which gives you a simple cluster setup. All you need to do is download the QuickStart VM and play around with the settings accordingly. The QuickStart VM can be found here.
Reference: Cloudera VM
Upvotes: 1
Reputation: 74619
Quoting the official documentation about Spark Standalone Mode:
./sbin/start-master.sh
./sbin/start-slave.sh <master-spark-URL>
In other words, you should start the standalone Master first (using ./sbin/start-master.sh), followed by one or more standalone Workers (using ./sbin/start-slave.sh).
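For the setup asked about (one Master and two Workers on a single laptop), a minimal sketch could look like this; it assumes the commands are run from the Spark installation directory and that the Master URL is spark://localhost:7077 (the actual URL is shown at the top of the Master's web UI):

# Start the standalone Master; its web UI comes up on http://localhost:8080
./sbin/start-master.sh

# Start two Worker processes on this machine and register them with the
# Master. SPARK_WORKER_INSTANCES tells the script how many workers to
# launch; their ports are offset automatically so they do not clash.
export SPARK_WORKER_INSTANCES=2
./sbin/start-slave.sh spark://localhost:7077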
Quoting the docs again:
Once you have started a worker, look at the master's web UI (http://localhost:8080 by default)
You're done. Congrats!
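To run the tutorial application on the cluster instead of in local mode, pass the Master URL to spark-submit. A sketch, assuming the SimpleApp class and jar produced by following the quick-start guide (your class and jar names may differ):

# Submit the line-counting example to the standalone cluster instead of
# running it in-process with master = "local"
./bin/spark-submit \
  --master spark://localhost:7077 \
  --class SimpleApp \
  target/scala-2.11/simple-project_2.11-1.0.jar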
Upvotes: 2