Daniel Kats

Reputation: 5554

Apache Spark: java.lang.NoSuchMethodError .rddToPairRDDFunctions

sbt package runs just fine, but after spark-submit I get the error:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.rddToPairRDDFunctions(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Ordering;)Lorg/apache/spark/rdd/PairRDDFunctions;
    at SmokeStack$.main(SmokeStack.scala:46)
    at SmokeStack.main(SmokeStack.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Here is the offending line:

val sigCounts = rowData.map(row => (row("Signature"), 1)).countByKey()

rowData is an RDD[Map[String, String]]. The "Signature" key exists in every map.
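For reference, here is a minimal self-contained version of what I'm doing (the rowData construction below is a stand-in; my real code builds it from parsed input):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._

object SmokeStack {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("Example1"))

    // Stand-in data; the real rowData is built elsewhere.
    val rowData = sc.parallelize(Seq(
      Map("Signature" -> "a", "Other" -> "x"),
      Map("Signature" -> "a", "Other" -> "y"),
      Map("Signature" -> "b", "Other" -> "z")
    ))

    // Mapping to (key, 1) pairs relies on the implicit conversion to
    // PairRDDFunctions (rddToPairRDDFunctions) named in the error.
    val sigCounts = rowData.map(row => (row("Signature"), 1)).countByKey()
    sigCounts.foreach(println)

    sc.stop()
  }
}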

I suspect this may be a build issue. Below is my sbt file:

name := "Example1"
version := "0.1"
scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
scalacOptions ++= Seq("-feature")

I'm new to Scala, so maybe the imports are not correct? I have:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import scala.io.Source

Upvotes: 18

Views: 34151

Answers (3)

Jake

Reputation: 4650

If you update the version of one Spark dependency, it is safest to update all of them to the same version.
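For example, a build.sbt along these lines keeps every Spark module on a single version (2.2.0 below is only illustrative; use whatever version your cluster runs):

// One shared version for all Spark modules; adjust to match your cluster.
val sparkVersion = "2.2.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
)

Marking the dependencies "provided" is a common choice when the jar is run via spark-submit, since the cluster already supplies Spark at runtime.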

Upvotes: 1

Shrikant Prabhu

Reputation: 735

I was facing the same problem while reading a simple one-line JSON file into a DataFrame and showing it with the .show() method. The error was thrown on the myDF.show() line of code.

For me it turned out to be the wrong version of the spark-sql library in the build.

i.e. the spark-sql jar that SBT pulled into my External Libraries was an older version instead of the one matching my Spark runtime.

Adding the following line to my build.sbt resolved the issue:

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
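For context, the failing scenario was roughly the following (the file name is a placeholder):

import org.apache.spark.sql.SparkSession

object JsonShowExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("JsonShow").getOrCreate()

    // Placeholder path; each line of the file must be one complete JSON object.
    val myDF = spark.read.json("oneline.json")

    // With the mismatched spark-sql jar, NoSuchMethodError was thrown here.
    myDF.show()

    spark.stop()
  }
}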

Upvotes: 2

maasg

Reputation: 37435

java.lang.NoSuchMethodError is often an indication that the code was compiled against a higher version of a library than the one available at runtime.

With Spark, that means that the Spark version used to compile is different from the one deployed (on the machine or cluster).

Aligning the versions between development and runtime should solve this issue.
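A quick way to check, assuming you can open spark-shell on the deployment machine, is to compare the runtime versions against those declared in build.sbt:

// Inside spark-shell on the target machine/cluster:
println(sc.version)                          // Spark version at runtime
println(scala.util.Properties.versionString) // Scala version at runtime

spark-submit --version prints the same information without starting a shell.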

Upvotes: 34
