Reputation: 2230
I am learning Scala to write a word-count driver program for Apache Spark. I am using Windows 7 and the latest Spark version, 2.2.0. While executing the program I get the error shown below.
How can I fix this and get the result?
name := "sample" version := "0.1" scalaVersion := "2.12.3" val sparkVersion = "2.2.0" libraryDependencies ++= Seq( "org.apache.spark" % "spark-core_2.11" % sparkVersion, "org.apache.spark" % "spark-sql_2.11" % sparkVersion, "org.apache.spark" % "spark-streaming_2.11" % sparkVersion )
Reader.scala
package com.demo.file
import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.spark.sql.SparkSession
object Reader {
def main(args: Array[String]): Unit = {
println("Welcome to Reader.")
val filePath = "C:\\notes.txt"
val spark = SparkSession.builder.appName("Simple app").config("spark.master", "local").getOrCreate()
val fileData = spark.read.textFile(filePath).cache()
val count_a = fileData.filter(line => line.contains("a")).count()
val count_b = fileData.filter(line => line.contains("b")).count()
println(s" count of A $count_a and count of B $count_b")
spark.stop()
}
}
Error
Welcome to Reader.
Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
	at org.apache.spark.SparkConf$DeprecatedConfig.<init>(SparkConf.scala:723)
	at org.apache.spark.SparkConf$.<init>(SparkConf.scala:571)
	at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
	at org.apache.spark.SparkConf.set(SparkConf.scala:92)
	at org.apache.spark.SparkConf.set(SparkConf.scala:81)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6$$anonfun$apply$1.apply(SparkSession.scala:905)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6$$anonfun$apply$1.apply(SparkSession.scala:905)
	at scala.collection.mutable.HashMap.$anonfun$foreach$1(HashMap.scala:138)
	at scala.collection.mutable.HashTable.foreachEntry(HashTable.scala:236)
	at scala.collection.mutable.HashTable.foreachEntry$(HashTable.scala:229)
	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
	at scala.collection.mutable.HashMap.foreach(HashMap.scala:138)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:905)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
	at com.demo.file.Reader$.main(Reader.scala:11)
	at com.demo.file.Reader.main(Reader.scala)
Caused by: java.lang.ClassNotFoundException: scala.Product$class
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 18 more
Upvotes: 0
Views: 579
Reputation: 3250
Spark 2.2.0 is built and distributed to work with Scala 2.11 by default. To write applications in Scala you need to use a compatible Scala version (e.g. 2.11.x), but your build sets scalaVersion := "2.12.3" while depending on the _2.11 Spark artifacts. Scala 2.12 is not binary compatible with 2.11 (the missing scala.Product$class exists only in the 2.11 standard library), which is why the NoClassDefFoundError is thrown at runtime.
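For reference, a build.sbt consistent with Spark 2.2.0 could look like the following sketch. It pins scalaVersion to a 2.11.x release (2.11.11 is just one example) and uses the %% operator, which appends the Scala binary-version suffix to each artifact name so it cannot drift out of sync with scalaVersion:

name := "sample"

version := "0.1"

// Spark 2.2.0 artifacts are published for Scala 2.11 only,
// so compile against a 2.11.x release.
scalaVersion := "2.11.11"

val sparkVersion = "2.2.0"

libraryDependencies ++= Seq(
  // %% appends the Scala binary version (_2.11 here) to the
  // artifact name, keeping it in sync with scalaVersion.
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion
)

After changing scalaVersion, reload sbt and rebuild so the project is recompiled against the 2.11 standard library.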
Upvotes: 4