SidiAli

Reputation: 149

Why does compiling Spark application from within sbt session fail with "object apache is not a member of package org"?

I've created a simple file hello.scala containing the following code:

import org.apache.spark.SparkContext

object HelloSbt extends App {
  println("Welcome to this thing!")
}

Please note that I'm not using an IDE.

When I run the command compile I get this error:

object apache is not a member of package org

The code without the import line works fine.

I looked online and the advice was to modify build.sbt and add the Spark dependencies to it. My build.sbt looks as follows:

name := "tasky"
version := "0.1.0"
scalaVersion := "2.12.3"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "0.9.0-incubating",
  "org.apache.spark" %% "spark-streaming" % "0.9.0-incubating",
  "org.apache.spark" %% "spark-streaming-twitter" % "0.9.0-incubating")

compile keeps failing with the error though. Why?

Upvotes: 0

Views: 131

Answers (1)

Jacek Laskowski

Reputation: 74619

After you've started sbt and are in the sbt shell, changes to build.sbt are not picked up until you run reload (which reloads any changes made since the session was started).

sbt:so> help reload
reload

    (Re)loads the project in the current directory.

reload plugins

    (Re)loads the plugins project (under project directory).

reload return

    (Re)loads the root project (and leaves the plugins project).
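
So the typical workflow after editing build.sbt in your editor is simply to reload and then compile again. A minimal session sketch (the tasky project name in the prompt is assumed from your build file):

sbt:tasky> reload
sbt:tasky> compile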

There are two issues with your build.sbt (a corrected sketch follows this list).

  1. You should use the latest and greatest 2.2.0 as the version of the Spark dependencies.

  2. Spark supports Scala 2.11, so scalaVersion should be 2.11.8 (up to 2.11.11).
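
Applying both fixes, a minimal build.sbt would look something like the sketch below. The spark-core and spark-streaming coordinates are the standard ones; spark-streaming-twitter left the main Spark distribution in the 2.x line, so the Apache Bahir coordinates shown for it are an assumption you should verify against the Bahir documentation:

name := "tasky"
version := "0.1.0"
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // Spark 2.2.0 artifacts are published for Scala 2.11, matching scalaVersion above
  "org.apache.spark" %% "spark-core" % "2.2.0",
  "org.apache.spark" %% "spark-streaming" % "2.2.0",
  // spark-streaming-twitter moved out of the main Spark repo in 2.x; these Bahir
  // coordinates are an assumption - check the Bahir docs for the exact version
  "org.apache.bahir" %% "spark-streaming-twitter" % "2.2.0"
)

After saving the file, run reload in the sbt shell (or restart sbt) and compile should then resolve org.apache.spark.SparkContext.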

Upvotes: 1
