Reputation: 119
I'm trying to run a Scala application. In spark-shell it works well, but when I use spark-submit with my class, it fails.
spark-submit --deploy-mode cluster --master yarn --class org.apache.spark.examples.SparkPi s3n://bucket/test.scala
Application:
package org.apache.spark.examples

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SparkPi {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Simple Application")
    // Create the SparkContext from the config; `sc` is only predefined in spark-shell
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    println("test")
    sc.stop()
  }
}
Error:
Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi)
Upvotes: 0
Views: 1142
Reputation: 1578
Build a jar from your test.scala source and pass that jar as the argument to spark-submit. spark-submit expects your compiled code packaged in a jar, not the source file itself.
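For example, with an sbt project (a minimal sketch; the project name, versions, and jar path below are illustrative assumptions, not taken from the question):

```shell
# Assumed layout:
#   build.sbt
#   src/main/scala/SparkPi.scala    <- the code from the question
#
# build.sbt might contain something like:
#   name := "spark-pi"
#   scalaVersion := "2.10.6"
#   libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.6.0" % "provided"

# Compile and package the classes into a jar:
sbt package

# Submit the jar (not the .scala file) to YARN:
spark-submit \
  --deploy-mode cluster \
  --master yarn \
  --class org.apache.spark.examples.SparkPi \
  target/scala-2.10/spark-pi_2.10-0.1-SNAPSHOT.jar
```

With `--deploy-mode cluster` the jar is shipped to the cluster, so the driver can load `org.apache.spark.examples.SparkPi` from it, which is exactly what the ClassNotFoundException was complaining about.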
Upvotes: 1