Reputation: 83
I am trying to run a long-lived Spark session using Spring Boot. My aim is to run Spark in YARN mode from a Spring Boot application.
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

// Configure log4j for driver and executors and try to reach the YARN cluster
SparkConf conf = new SparkConf()
        .set("spark.driver.extraJavaOptions", "-Dlog4j.configuration=file://src/main/resources/log4j.properties")
        .set("spark.executor.extraJavaOptions", "-Dlog4j.configuration=file://src/main/resources/log4j.properties")
        .set("yarn.resoursemanager.address", "http://my-yarn-host")
        .set("spark.yarn.jars", "BOOT-INF/lib/spark-*.jar")
        .setAppName("NG-Workbench")
        .setMaster("yarn");

JavaSparkContext sc = new JavaSparkContext(conf);

List<String> word = new ArrayList<>();
word.add("Sidd");

// Count the occurrences of each value in a small in-memory RDD
JavaRDD<String> words = sc.parallelize(Arrays.asList("Michel", "Steve"));
Map<String, Long> wordCounts = words.countByValue();
wordCounts.forEach((k, v) -> System.out.println(k + " " + v));

sc.close();
Upvotes: 2
Views: 735
Reputation: 1771
I would suggest adding some configuration files to your artifact:
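(The answer does not list the files here; as an assumption, the usual candidates are the Hadoop client configuration files core-site.xml and yarn-site.xml, placed under src/main/resources so they end up on the classpath of the Spring Boot artifact, in BOOT-INF/classes, where Spark's YARN client can pick them up. A possible layout:)

src/
  main/
    resources/
      core-site.xml
      yarn-site.xml
      log4j.properties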
Otherwise you can add these two properties to your SparkConf:
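(The two property names are not included here, so the snippet below is only a sketch that assumes the answer means the ResourceManager address settings. Spark copies any spark.hadoop.* entry into the underlying Hadoop/YARN configuration, so they can be set directly on the SparkConf; the host and port are placeholders to replace with your own cluster values.)

SparkConf conf = new SparkConf()
        .setAppName("NG-Workbench")
        .setMaster("yarn")
        // assumed properties, forwarded by Spark into the Hadoop/YARN configuration
        .set("spark.hadoop.yarn.resourcemanager.hostname", "my-yarn-host")
        .set("spark.hadoop.yarn.resourcemanager.address", "my-yarn-host:8032"); // 8032 is the default RM port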
Upvotes: 0