Venki Venkatesh

Reputation: 33

Apache Spark - UDF doesn't seem to work with spark-submit

I am unable to get a UDF to work with spark-submit, though I have no problem using spark-shell.

Please see below the error message, sample code, build.sbt, and the command used to run the program.

I would appreciate any help! Regards, Venki


Error message (line 20 is where the UDF is defined):

Exception in thread "main" java.lang.NoSuchMethodError:
scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)
Lscala/reflect/api/JavaUniverse$JavaMirror;
at TryUDFApp$.main(TryUDFApp.scala:20)

CODE:

/* TryUDFApp.scala */

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.sql._
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object TryUDFApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    // print "Hello world"
    println("Hello World -- I am trying to use UDF!")
    // Create a UDF
    val tryUDF = udf { (arg1: String, arg2: String) => arg2 + arg1 }
  }
}

build.sbt

name := "TryUDFApp Project"
version := "1.0"
scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.6.1",
    "org.apache.spark" %% "spark-sql"  % "1.6.1"
)

Command to run the code:

$SPARK_HOME/bin/spark-submit --class "TryUDFApp" --master local[4] $TADIR/target/scala-2.11/tryudfapp-project_2.11-1.0.jar

echo $SPARK_HOME

/Users/venki/Spark/spark-1.6.1-bin-hadoop2.6

Upvotes: 3

Views: 1871

Answers (1)

Alfredo Gimenez

Reputation: 2224

When you see a NoSuchMethodError or ClassNotFoundException involving a Scala library (in this case, scala.reflect.api.JavaUniverse.runtimeMirror), it usually means a Scala version mismatch happened somewhere.

You're using Spark 1.6.1, which comes pre-built for Scala 2.10, but your project is built with Scala 2.11.7, hence the error.
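A quick way to confirm the mismatch is to check which Scala version your pre-built Spark actually ships with. As a sketch (scala.util.Properties is part of the Scala standard library), run this inside spark-shell:

// Run inside spark-shell; prints the Scala version the Spark binaries were built with
println(scala.util.Properties.versionString)
// Pre-built Spark 1.6.1 should print something like "version 2.10.5"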

Your options are:

  1. Downgrade your project to Scala 2.10 (see the build.sbt sketch after this list)
  2. Build Spark 1.6.1 with Scala 2.11 support from source
  3. Use Spark 2.0, which comes pre-built with Scala 2.11 support
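For option 1, a minimal sketch of the adjusted build.sbt, assuming you stay on Spark 1.6.1: only scalaVersion changes, and the %% operator picks up the matching 2.10 artifacts automatically.

name := "TryUDFApp Project"
version := "1.0"
// Match the Scala line that pre-built Spark 1.6.1 ships with
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.6.1",
    "org.apache.spark" %% "spark-sql"  % "1.6.1"
)

Note that the output jar path then changes accordingly, e.g. $TADIR/target/scala-2.10/tryudfapp-project_2.10-1.0.jar.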

Upvotes: 3
