I'm trying to execute a Spark/Scala code snippet (given below) in Eclipse. I have created a Maven project for it, but I'm getting the following error when I try to run the code:
not found: type SparkConf
My code is:
package extraction

import org.apache.spark._
import org.apache.spark.SparkConf

object JsonParser {
  val conf = new SparkConf().setAppName("Spark json extract")
  conf.setMaster("local");
  val sc = new SparkContext(conf)
  val sqlContext = new SQLContext(sc)

  def main(args: Array[String]): Unit = {
    val df = sqlContext.read.json("F:\\test1.json")
    df.registerTempTable("jsonExtract")
    val data = sqlContext.sql("select * from jsonExtract")
    data.show();
    sc.stop
  }
}
pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>JSON</groupId>
  <artifactId>JSON</artifactId>
  <version>0.0.1-SNAPSHOT</version>

  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.2.0</version>
    </dependency>
  </dependencies>

  <build>
    <sourceDirectory>src</sourceDirectory>
    <resources>
      <resource>
        <directory>src</directory>
        <excludes>
          <exclude>**/*.java</exclude>
        </excludes>
      </resource>
    </resources>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.7.0</version>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
How can I fix this error? Is it not possible to build this project in Eclipse?
I think a dependency is missing from your <dependencies> </dependencies> section. See Maven POM.
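For example, the snippet uses SQLContext, which is part of the spark-sql artifact, not spark-core. A sketch of the dependencies block, assuming the same Spark 2.2.0 / Scala 2.11 versions as the existing spark-core entry, could look like this:

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
  </dependency>
  <!-- spark-sql provides org.apache.spark.sql.SQLContext -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.2.0</version>
  </dependency>
</dependencies>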
Edit:
It could also be a repository issue. Check that you have the right libraries in your local repository:
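One way to verify, assuming a default Maven installation, is to let Maven print the resolved dependency tree and to inspect the local repository directly:

mvn dependency:tree
ls ~/.m2/repository/org/apache/spark/spark-core_2.11/2.2.0/

On Windows the local repository usually sits at %USERPROFILE%\.m2\repository. If the jars are missing or corrupted there, delete the affected directory and re-run mvn compile so Maven downloads them again.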