Luke Hewitt

Reputation: 91

Bad symbolic reference when building and including spark

I'm working on something which involves extending the Spark project. I'm using Spark's make-distribution.sh and successfully building the jars from the code on GitHub. However, when I include the jars as a dependency and compile this in IntelliJ:

package org.apache.spark

object Main extends App {
  println(org.apache.spark.util.Utils.random.nextInt())
}

I get a compilation error:

Error:scalac: bad symbolic reference. A signature in Utils.class refers to term util in package com.google.common which is not available. It may be completely missing from the current classpath, or the version on the classpath might be incompatible with the version used when compiling Utils.class.

Can anybody advise me as to what's going wrong here? Thanks for any help! -Luke

Upvotes: 0

Views: 788

Answers (2)

Sathish

Reputation: 5173

Not sure whether it will be helpful now (as it's an old post); however, this issue is fixed in the latest version of Spark - https://issues.apache.org/jira/browse/SPARK-5466

Upvotes: 0

suztomo

Reputation: 5202

The error message says your scalac cannot find the term util in the package com.google.common:

Utils.class depends on "util" in the package "com.google.common".

com.google.common is not available to your scalac, for one of two reasons:

the package may be completely missing from the current classpath, or

the version on the classpath might be incompatible with the version used when compiling Utils.class.

Check how the dependencies of your Spark project are resolved.
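As a concrete illustration (a minimal sketch, assuming an sbt-based project; the Guava coordinates and version below are assumptions and should match whatever Guava version the Spark jars were compiled against), declaring Guava explicitly in build.sbt is one way to put the missing package on the classpath:

// build.sbt -- hypothetical sketch; the version number is an assumption
// and should match the Guava version used when the Spark jars were built.
libraryDependencies += "com.google.guava" % "guava" % "14.0.1"

If the project is built with Maven instead, the equivalent is a <dependency> entry for the same artifact.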

Upvotes: 0
