Durga Viswanath Gadiraju

Reputation: 3956

spark-submit is not exiting until I hit ctrl+C

I am running the following spark-submit command to run a Spark Scala program on a Hortonworks VM. The job completes successfully, but spark-submit does not exit until I hit Ctrl+C. Why?

spark-submit --class SimpleApp --master yarn-client --num-executors 3 --driver-memory 512m --executor-memory 512m --executor-cores 1 target/scala-2.10/application_2.10-1.0.jar /user/root/decks/largedeck.txt

Here is the code I am running.

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf) 
    val cards = sc.textFile(args(0)).flatMap(_.split(" "))
    val cardCount = cards.count()
    println(cardCount)
  }
}

Upvotes: 3

Views: 2631

Answers (2)

Aravind Krishnakumar

Reputation: 2777

I had the same kind of problem when writing files to S3 with Spark 2.0. If the job still hangs for you even after adding stop(), try the settings below.

In Spark 2.0 you can use:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("App_name").getOrCreate()
// Write output directly to the destination and skip the _SUCCESS marker files
spark.conf.set("spark.hadoop.mapred.output.committer.class", "com.appsflyer.spark.DirectOutputCommitter")
spark.conf.set("mapreduce.fileoutputcommitter.marksuccessfuljobs", "false")

Upvotes: 1

Atul Soman

Reputation: 4720

You have to call stop() on the context to exit your program cleanly.
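
For the program in the question, that means calling sc.stop() at the end of main. A minimal sketch of the fix (same logic as the question's code, plus the stop() call):

import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val cards = sc.textFile(args(0)).flatMap(_.split(" "))
    println(cards.count())
    sc.stop() // shuts down the SparkContext so spark-submit can exit
  }
}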

Upvotes: 7
