Reputation: 259
My Spark environment
Spark -> 2.1.0
Hadoop -> 2.8.1
Eclipse -> Neon 2
I am stuck getting the Spark context in YARN mode. How can I get a Spark context in YARN mode? Please help me resolve this.
My Hadoop, YARN, and Spark installations are successful.
$ jps
3200 NameNode
5264 ExecutorLauncher
5328 CoarseGrainedExecutorBackend
3555 SecondaryNameNode
5316 CoarseGrainedExecutorBackend
7590 Jps
3357 DataNode
4045 NodeManager
5118 SparkSubmit
3727 ResourceManager
My source code to get the Spark context in YARN mode:
import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class JavaClient {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("SparkTest").setMaster("yarn-client");
        SparkSession spark = SparkSession.builder().config(conf).getOrCreate();
        System.out.println(spark.version() + " : " + spark.sparkContext());
    }
}
Output:
17/09/22 10:24:11 INFO Client: Application report for application_1506052073594_0011 (state: ACCEPTED)
17/09/22 10:24:12 INFO Client: Application report for application_1506052073594_0011 (state: ACCEPTED)
17/09/22 10:24:13 INFO Client: Application report for application_1506052073594_0011 (state: ACCEPTED)
...
and so on; the Spark session is never returned.
Upvotes: 0
Views: 380
Reputation: 2057
Check in the ResourceManager UI (localhost:8088) that you have available memory/CPU slots for the Application Master and the executors. An application stuck in the ACCEPTED state usually means YARN cannot allocate the containers it was asked for.
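If resources are tight, one option is to shrink the requested container sizes so they fit your cluster. A minimal sketch, assuming a small single-node setup (the class name and the memory/core values are hypothetical; tune them to your NodeManager limits):

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class SmallFootprintClient {
    public static void main(String[] args) {
        // Hypothetical sizing: requests small enough to fit a modest single-node cluster.
        SparkConf conf = new SparkConf()
                .setAppName("SparkTest")
                .setMaster("yarn")
                .set("spark.yarn.am.memory", "512m")    // Application Master container
                .set("spark.executor.memory", "512m")   // per-executor heap
                .set("spark.executor.cores", "1")
                .set("spark.executor.instances", "1");
        SparkSession spark = SparkSession.builder().config(conf).getOrCreate();
        System.out.println(spark.version());
        spark.stop();
    }
}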
Upvotes: 2
Reputation: 395
Try just "yarn" instead of "yarn-client". In Spark 2.x the "yarn-client" and "yarn-cluster" master URLs are deprecated; use "yarn" and control the deploy mode separately (it defaults to client).
import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class JavaClient {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("SparkTest").setMaster("yarn");
        SparkSession spark = SparkSession.builder().config(conf).getOrCreate();
        System.out.println(spark.version() + " : " + spark.sparkContext());
    }
}
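Note that when launching like this from an IDE such as Eclipse, only the client deploy mode is available, and HADOOP_CONF_DIR (or YARN_CONF_DIR) must point at your cluster configuration so Spark can locate the ResourceManager.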
Upvotes: 0