Mohammad Sunny

Reputation: 69

Set up Apache Spark with Yarn Cluster

I want to integrate YARN with Apache Spark. I have installed Spark, the JDK, and Scala on my PC. My data is stored in a Cassandra database. I have also set up another server to act as a slave.

  • Spark version: 2.1.0
  • Scala version: 2.9.2
  • Master (my PC): IP 192...01
  • Slave server: IP 192...02

Spark and Scala are also installed on the slave server. Do I need to install anything else on the master or the slave? If everything is installed, what configuration do I need to integrate YARN with Spark?

I am writing a word-count program that uses a cluster manager (YARN); my aim is to use YARN in my application. Any further suggestions are welcome. Please help.

Upvotes: 3

Views: 149

Answers (1)

Subash

Reputation: 895

  • You need to install Hadoop 2.x so that YARN is available. Here is the link.
  • Next, launch Spark with YARN in cluster mode: $ ./bin/spark-submit --class path.to.your.Class --master yarn --deploy-mode cluster [options] <app jar> [app options]. You can refer here (a minimal word-count sketch follows below).
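To illustrate the word-count application the question mentions, here is a minimal Scala sketch that could be packaged into a jar and submitted with the command above. It assumes Hadoop/YARN is already installed and that HADOOP_CONF_DIR (or YARN_CONF_DIR) points at the directory holding the cluster's client-side Hadoop configuration files, which is how spark-submit locates the YARN ResourceManager. The object name, jar name, and argument handling are assumptions for illustration, not taken from the question.

    import org.apache.spark.{SparkConf, SparkContext}

    // Minimal word-count sketch; object name and input/output paths are assumptions.
    object WordCount {
      def main(args: Array[String]): Unit = {
        // No master URL is hard-coded here: spark-submit supplies it
        // via --master yarn --deploy-mode cluster.
        val conf = new SparkConf().setAppName("WordCount")
        val sc   = new SparkContext(conf)

        val counts = sc.textFile(args(0))            // input path, e.g. a file on HDFS
          .flatMap(line => line.split("\\s+"))       // split each line into words
          .map(word => (word, 1))                    // pair each word with a count of 1
          .reduceByKey(_ + _)                        // sum the counts per word

        counts.saveAsTextFile(args(1))               // output directory
        sc.stop()
      }
    }

It would then be submitted with the command from the previous bullet, for example: ./bin/spark-submit --class WordCount --master yarn --deploy-mode cluster wordcount.jar <input path> <output path> (the jar name here is hypothetical).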

Upvotes: 1
