Reputation: 61
I am new to Spark, and I am trying to submit my Spring Spark application to a YARN cluster. The SparkConf is initialized in Spring, but it does not pick up the YARN details on submit and always points to local. I know I am missing some configuration.
The code used is shown below:
SparkConf sparkconf = new SparkConf().setAppName("app name")
.set("spark.port.maxRetries", "100")
.set("spark.ui.port", "4060")
.set("spark.executor.memory", "7g")
.set("spark.executor.cores", "2")
.set("SPARK_YARN_MODE", "true")
.setSparkHome("spark home directory")
.set("SPARK_JAR_HDFS_PATH", "directory of spark-assembly.jar")
.set("SPARK_CONF_DIR", "config directory")
.setMaster("yarn-client");
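One thing to note: SPARK_YARN_MODE, SPARK_JAR_HDFS_PATH and SPARK_CONF_DIR are normally environment variables, so setting them through SparkConf.set() is unlikely to have any effect. SparkConf itself also does not read yarn-site.xml; the Hadoop config directory has to be visible (e.g. on the classpath or via HADOOP_CONF_DIR). As a minimal sketch, assuming "rm-host" is a placeholder for the real ResourceManager host, the address could instead be pinned explicitly through the spark.hadoop.* prefix, which Spark copies into the Hadoop Configuration it builds:

```java
// Sketch, not the original code: "rm-host" is a hypothetical hostname.
// Any "spark.hadoop.*" property is forwarded into the Hadoop
// Configuration, so this bypasses the yarn-site.xml lookup entirely.
SparkConf sparkconf = new SparkConf().setAppName("app name")
        .setMaster("yarn-client")
        .set("spark.hadoop.yarn.resourcemanager.address", "rm-host:8032");
```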
The log shows it connecting to the default ResourceManager address (0.0.0.0:8032), which is why it behaves as if the YARN settings were never applied:
[o.a.h.y.c.RMProxy:56] Connecting to ResourceManager at /0.0.0.0:8032
The Hadoop Configuration is loaded from:
conf.addResource(new Path(filepath+ "/hbase-site.xml"));
conf.addResource(new Path(filepath+ "/core-site.xml"));
conf.addResource(new Path(filepath+ "/hdfs-site.xml"));
conf.addResource(new Path(filepath+ "/yarn-site.xml"));
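A quick way to confirm whether yarn-site.xml was actually read is to query the ResourceManager address from the same conf object: if the file was not loaded, the lookup falls back to the YARN default of 0.0.0.0:8032, which is exactly the address in the log line above. A small sketch against the conf variable from the snippet:

```java
// Sanity check (sketch): prints the default 0.0.0.0:8032 when
// yarn-site.xml was not picked up, and the real RM address when it was.
System.out.println(conf.get("yarn.resourcemanager.address", "0.0.0.0:8032"));
```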
Upvotes: 1
Views: 472
Reputation: 61
The reason was that the configuration files on the resource path, especially yarn-site.xml, were not being picked up properly; doing a clean & rebuild of the project resolved the issue.
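For anyone hitting the same symptom, a stale build can also be detected programmatically. This is a hypothetical standalone check (not part of the original project): if the build packaged yarn-site.xml, it is visible to the classloader; a null result means the build output is missing it, which is the situation the clean & rebuild fixed.

```java
import java.net.URL;

public class ClasspathCheck {
    public static void main(String[] args) {
        // getResource returns null when yarn-site.xml is not in the
        // build output / classpath, i.e. a stale or partial build.
        URL url = ClasspathCheck.class.getClassLoader()
                .getResource("yarn-site.xml");
        System.out.println(url != null ? "found: " + url : "not on classpath");
    }
}
```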
Upvotes: 0