Paul Reiners

Reputation: 7886

Scala/Spark: NoClassDefFoundError: net/liftweb/json/Formats

I am trying to create a JSON string from a Scala object as described here.

I have the following code:

import scala.collection.mutable._
import net.liftweb.json._
import net.liftweb.json.Serialization.write

case class Person(name: String, address: Address)
case class Address(city: String, state: String)

object LiftJsonTest extends App {

  val p = Person("Alvin Alexander", Address("Talkeetna", "AK"))

  // create a JSON string from the Person, then print it
  implicit val formats = DefaultFormats
  val jsonString = write(p)
  println(jsonString)

}
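
For reference, when run directly (e.g. with sbt run), this should print something like:

{"name":"Alvin Alexander","address":{"city":"Talkeetna","state":"AK"}}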

My build.sbt file contains the following:

libraryDependencies += "net.liftweb" %% "lift-json" % "2.5+"

When I build with sbt package, it succeeds.

However, when I try to run it as a Spark job, like this:

spark-submit \
  --packages com.amazonaws:aws-java-sdk-pom:1.10.34,org.apache.hadoop:hadoop-aws:2.6.0,net.liftweb:lift-json:2.5+ \
  --class "com.foo.MyClass" \
  --master local[4] \
  target/scala-2.10/my-app_2.10-0.0.1.jar

I get this error:

Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: net.liftweb#lift-json;2.5+: not found]
    at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1068)
    at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:287)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:154)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

What am I doing wrong here? Is net.liftweb:lift-json:2.5+ in my --packages argument incorrect? Do I need to add a resolver in build.sbt?

Upvotes: 1

Views: 1781

Answers (1)

Alexey Romanov

Reputation: 170713

From the Spark documentation on submitting applications:

Users may also include any other dependencies by supplying a comma-delimited list of maven coordinates with --packages.

2.5+ in your build.sbt is Ivy version-matcher syntax, not an actual artifact version, and Maven coordinates require an exact version. spark-submit apparently doesn't resolve Ivy version matchers (and I think it would be surprising if it did; your application could suddenly stop working because a new dependency version was published). So you need to find out what version 2.5+ actually resolves to in your build, e.g. using https://github.com/jrudolph/sbt-dependency-graph (or by looking through the output of show dependencyClasspath in sbt).
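
To check the resolved version with sbt-dependency-graph, you can add the plugin to project/plugins.sbt and print the dependency tree (0.8.2 is one published plugin version; pick whichever matches your sbt):

addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")

Then run:

sbt dependencyTree

As a sketch of the fix, suppose 2.5+ resolved to 2.6 (an assumption; substitute whatever version your build actually picked). Pin it in build.sbt:

libraryDependencies += "net.liftweb" %% "lift-json" % "2.6"

Then pass the exact Maven coordinate to spark-submit. Note that sbt's %% appends the Scala version to the artifact name, so the Maven artifact is lift-json_2.10, not lift-json:

spark-submit \
  --packages com.amazonaws:aws-java-sdk-pom:1.10.34,org.apache.hadoop:hadoop-aws:2.6.0,net.liftweb:lift-json_2.10:2.6 \
  --class "com.foo.MyClass" \
  --master local[4] \
  target/scala-2.10/my-app_2.10-0.0.1.jar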

Upvotes: 1
