scalacode

Reputation: 1106

run spark locally with intellij

I wrote this:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object ProcessingApp extends App {
  val sparkConf = new SparkConf()
    .setAppName("er")
    .setMaster("local")
  val sparkSession: SparkSession = SparkSession.builder().config(sparkConf).getOrCreate()

  val test = sparkSession.version

  println(test)

}

I want to run it locally from my IntelliJ IDE by right-clicking and choosing "Run ProcessingApp", but this doesn't work. I made my Spark dependencies not provided at the build.sbt level. I am getting this error:

Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.SparkSession
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass

Upvotes: 4

Views: 3767

Answers (2)

Chitral Verma

Reputation: 2855

Change the scope of all the Spark dependencies from provided to compile.
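A minimal build.sbt along these lines should make the app runnable from IntelliJ. This is a sketch under assumptions: the project name and the Scala/Spark versions below are placeholders, so match them to what your project actually uses.

```scala
// build.sbt — minimal sketch; name and versions are assumptions, adjust to your project
name := "processing-app"

scalaVersion := "2.12.18"

// No `% "provided"` here: with the default compile scope, spark-sql (which
// brings in spark-core and SparkSession) is on the classpath when IntelliJ
// launches ProcessingApp directly, avoiding the ClassNotFoundException.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.1"
```

Note that if you later submit the jar to a cluster with spark-submit, you would typically mark the Spark dependencies `% "provided"` again, since the cluster supplies them at runtime.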

Upvotes: 10

Chida

Reputation: 1

Try right-clicking the jar file in the target directory and running it. If the dependencies are included in your jar, it should pick them up.

Upvotes: 0
