Jiang Xiang

Reputation: 3256

scalac compile yields "object apache is not a member of package org"

My code is:

import org.apache.spark.SparkContext

It runs fine in interactive mode, but when I compile it with scalac I get the following error message:

object apache is not a member of package org

This seems to be a classpath problem, but I do not know exactly how to configure the classpath.

Upvotes: 41

Views: 97157

Answers (5)

WattsInABox

Reputation: 4636

I had this issue because I had the wrong scope for my Spark dependency in my pom.xml file. This is wrong:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>${spark.version}</version>
  <scope>test</scope> <!-- will not be available during compile phase -->
</dependency>

Changing test -> provided will work: the dependency is then available at compile time but is not packaged into your uberjar, which is almost certainly what you want, since spark-submit supplies the Spark classes at runtime:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>${spark.version}</version>
  <scope>provided</scope>
</dependency>

Upvotes: 5

ramana mavuluri

Reputation: 39

I had the same issue when running a Scala word-count program on the Spark framework. I was using Eclipse as the IDE and Maven as the build tool. I changed the scope of the Spark dependency in the POM file from "test" to "provided", like below, and it worked:

<scope>provided</scope>

Upvotes: 0

Rahul Goyal

Reputation: 774

I was facing this issue in an sbt interactive session.

Resolved it by simply executing reload in the session.
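
For example, after editing build.sbt inside a running session, the steps look roughly like this (the bare > prompt is just a sketch; it varies by sbt version):

> reload
> compile

reload re-reads the build definition, so any newly added libraryDependencies are picked up before the next compile.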

Hope this helps!

Upvotes: 2

B-Tron of the Autobots

Reputation: 539

One easy way (if you're using the Play Framework) is to look up the library dependency in the Maven Repository, choose the version, choose the SBT tab, and then add it to the bottom of your project's build.sbt file, like so:

// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.2"
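
A note on %%, in case it is unfamiliar: it appends your project's Scala binary version to the artifact name, so on Scala 2.11 the line above is equivalent to spelling out the suffix yourself (shown purely for illustration):

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.3.2"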

Afterwards, you'll want to enter reload into the sbt console and then compile. This can feel a little foreign if you're coming from pip or JavaScript, but the Maven Repository is your friend.

Upvotes: 3

tgpfeiffer

Reputation: 1838

You need to tell the compiler where to find the libraries your Scala code uses. This is usually not done by hand but with a build tool such as Maven or sbt. You can find a minimal sbt setup at http://spark.apache.org/docs/1.2.0/quick-start.html#self-contained-applications
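
For reference, a minimal build.sbt along the lines of that quick-start looks roughly like this (the Scala and Spark versions are illustrative; match them to your installation):

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

// Pulls spark-core onto the compile classpath so the org.apache.spark import resolves
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"

Running sbt package from the project root then compiles (and jars) the code with Spark on the classpath, instead of invoking scalac directly.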

Upvotes: 26
