Satyanvesh Muppaneni

Reputation: 225

How to set up Spark on multi-node Hadoop cluster?

I would like to install Hadoop HDFS and Spark on a multi-node cluster.

I was able to successfully install and configure Hadoop on the multi-node cluster, and I have also installed and configured Spark on the master node.

Do I have to configure Spark on the slave nodes as well?

Upvotes: 1

Views: 1090

Answers (1)

Jacek Laskowski

Reputation: 74669

"Do I have to configure Spark on the slave nodes as well?"

No, you don't. You're done. In fact, you have already done more than is needed to submit Spark applications to Hadoop YARN (which I take to be your cluster manager).

Spark is a library for distributed computation on massive datasets, and as such it belongs solely to your Spark applications, not to any cluster you may use. With YARN as the cluster manager, the Spark runtime jars are shipped to the worker nodes with each application you submit, so there is nothing to install on the slaves.

Time to spark-submit Spark applications!
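As a minimal sketch, assuming a standard Spark distribution on the master node (the SparkPi example class ships with Spark; the Hadoop config path and the examples JAR path below are assumptions you may need to adjust for your setup):

    # Tell spark-submit where the YARN/HDFS configuration lives
    # (path is an assumption; point it at your cluster's config directory).
    export HADOOP_CONF_DIR=/etc/hadoop/conf

    # Submit the bundled SparkPi example to YARN in cluster mode.
    # YARN distributes the Spark runtime to the worker containers for you.
    $SPARK_HOME/bin/spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --class org.apache.spark.examples.SparkPi \
      $SPARK_HOME/examples/jars/spark-examples_*.jar 100

If the application finishes with a final status of SUCCEEDED, your setup works and nothing more is needed on the slaves.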

Upvotes: 3
