jd2050

Reputation: 302

Maven build ERROR (Scala + Spark): object apache is not a member of package org

I read several threads about this problem on StackOverflow and other resources (1st, 2nd, 3rd), but unfortunately they don't help. Also, almost all of them describe the same problem with SBT, not Maven.

I also checked the Scala/Spark compatibility in the Spark docs (here), and it seems like I have the correct versions (Scala 2.11.8 + Spark 2.2.0).

I will describe my full workflow, because I'm not sure what information could be helpful to establish the root cause.

This is the code I'm trying to build.

pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.example</groupId>
    <artifactId>SparkWordCount</artifactId>
    <version>1.0-SNAPSHOT</version>
    <name>SparkWordCount</name>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>

        <plugins>
            <!-- mixed scala/java compile -->
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.3.1</version>
                <executions>
                    <execution>
                        <id>compile</id>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                        <phase>compile</phase>
                    </execution>
                    <execution>
                        <id>test-compile</id>
                        <goals>
                            <goal>testCompile</goal>
                        </goals>
                        <phase>test-compile</phase>
                    </execution>
                    <execution>
                        <phase>process-resources</phase>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.5.1</version>
                <configuration>
                    <source>1.7</source>
                    <target>1.7</target>
                </configuration>
            </plugin>
            <!-- for fatjar -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>3.1.0</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <finalName>uber-SparkWordCount-1.0-SNAPSHOT</finalName>
                    <appendAssemblyId>false</appendAssemblyId>
                </configuration>
                <executions>
                    <execution>
                        <id>assemble-all</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-jar-plugin</artifactId>
                <version>3.2.0</version>
                <configuration>
                    <archive>
                        <manifest>
                            <addClasspath>true</addClasspath>
                            <mainClass>fully.qualified.MainClass</mainClass>
                        </manifest>
                    </archive>
                </configuration>
            </plugin>
        </plugins>
    </build>
    <dependencies>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.2.0</version>
            <scope>provided</scope>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.2.0</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>
</project>
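One detail worth flagging (just my guess, not something the thread confirms): this pom declares neither a scala-library dependency nor a <scalaVersion> for the scala-maven-plugin, so the plugin has to infer the Scala version on its own. Pinning it explicitly would look roughly like this:

```xml
<!-- Sketch: pin the Scala runtime so scala-maven-plugin picks up 2.11.8 -->
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.8</version>
</dependency>
```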


SparkWordCount.scala


import org.apache.spark.sql.SparkSession
import org.apache.log4j.Logger
import org.apache.log4j.Level

object SparkWordCount {
  def main(args: Array[String]) {

    Logger.getLogger("org").setLevel(Level.ERROR)

    val spark = SparkSession
      .builder()
      .appName("SparkSessionZipsExample")
      .master("local")
      .getOrCreate()

    val myRdd = spark.sparkContext.parallelize(List(1,2,3,4,5,6,7))
    myRdd.foreach(number => println("Lol, this is number = " + number))
  }
}

It works fine when I just launch the main() method:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Lol, this is number = 1
Lol, this is number = 2
Lol, this is number = 3
Lol, this is number = 4
Lol, this is number = 5
Lol, this is number = 6
Lol, this is number = 7

Then I tried to use SparkSQL DataFrame:

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.log4j.Logger
import org.apache.log4j.Level

object SparkWordCount {
  def main(args: Array[String]) {

    Logger.getLogger("org").setLevel(Level.ERROR)

    val spark = SparkSession
      .builder()
      .appName("SparkSessionZipsExample")
      .master("local")
      .getOrCreate()

    val airports: DataFrame = spark.read.csv("C:\\Users\\Евгений\\Desktop\\DATALEARN\\airports.csv")
    airports.show()
  }
}

And this code throws an error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/Logger
    at SparkWordCount$.main(SparkWordCount.scala:10)
    at SparkWordCount.main(SparkWordCount.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.Logger
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 2 more

Process finished with exit code 1

I'm not sure how exactly it works, but I fixed this by changing my Run Configuration and checking the "Include dependencies with Provided scope" checkbox.
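For reference, an alternative to the IDE checkbox (a sketch I haven't verified on my setup) would be a Maven profile that switches the Spark dependencies to compile scope for local runs, activated with mvn -Plocal-run:

```xml
<!-- Hypothetical profile: override the provided-scope Spark deps for local runs -->
<profiles>
    <profile>
        <id>local-run</id>
        <dependencies>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.11</artifactId>
                <version>2.2.0</version>
                <scope>compile</scope>
            </dependency>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-sql_2.11</artifactId>
                <version>2.2.0</version>
                <scope>compile</scope>
            </dependency>
        </dependencies>
    </profile>
</profiles>
```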


After this my SparkSQL code also works fine:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
+---------+--------------------+-------------+-----+-------+--------+----------+
|      _c0|                 _c1|          _c2|  _c3|    _c4|     _c5|       _c6|
+---------+--------------------+-------------+-----+-------+--------+----------+
|IATA_CODE|             AIRPORT|         CITY|STATE|COUNTRY|LATITUDE| LONGITUDE|
|      ABE|Lehigh Valley Int...|    Allentown|   PA|    USA|40.65236| -75.44040|
|      ABI|Abilene Regional ...|      Abilene|   TX|    USA|32.41132| -99.68190|
|      ABQ|Albuquerque Inter...|  Albuquerque|   NM|    USA|35.04022|-106.60919|
|      ABR|Aberdeen Regional...|     Aberdeen|   SD|    USA|45.44906| -98.42183|
|      ABY|Southwest Georgia...|       Albany|   GA|    USA|31.53552| -84.19447|
|      ACK|Nantucket Memoria...|    Nantucket|   MA|    USA|41.25305| -70.06018|
|      ACT|Waco Regional Air...|         Waco|   TX|    USA|31.61129| -97.23052|
|      ACV|      Arcata Airport|Arcata/Eureka|   CA|    USA|40.97812|-124.10862|
|      ACY|Atlantic City Int...|Atlantic City|   NJ|    USA|39.45758| -74.57717|
|      ADK|        Adak Airport|         Adak|   AK|    USA|51.87796|-176.64603|
|      ADQ|      Kodiak Airport|       Kodiak|   AK|    USA|57.74997|-152.49386|
|      AEX|Alexandria Intern...|   Alexandria|   LA|    USA|31.32737| -92.54856|
|      AGS|Augusta Regional ...|      Augusta|   GA|    USA|33.36996| -81.96450|
|      AKN| King Salmon Airport|  King Salmon|   AK|    USA|58.67680|-156.64922|
|      ALB|Albany Internatio...|       Albany|   NY|    USA|42.74812| -73.80298|
|      ALO|Waterloo Regional...|     Waterloo|   IA|    USA|42.55708| -92.40034|
|      AMA|Rick Husband Amar...|     Amarillo|   TX|    USA|35.21937|-101.70593|
|      ANC|Ted Stevens Ancho...|    Anchorage|   AK|    USA|61.17432|-149.99619|
|      APN|Alpena County Reg...|       Alpena|   MI|    USA|45.07807| -83.56029|
+---------+--------------------+-------------+-----+-------+--------+----------+
only showing top 20 rows
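Side note: the first data row in that output is actually the CSV header (IATA_CODE, AIRPORT, ...). As far as I know, Spark's CSV reader can promote it to column names with the header option, though I haven't re-run this myself:

```scala
// Treat the first CSV line as column names instead of data
val airports = spark.read
  .option("header", "true")
  .csv("C:\\Users\\Евгений\\Desktop\\DATALEARN\\airports.csv")
airports.show()
```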

But when I try to execute the Maven "clean -> package" commands, I get several errors, all of them concerning the "org.apache" package:

D:\Work Projects\SparkWordCount\src\main\scala\SparkWordCount.scala:3: error: object apache is not a member of package org
import org.apache.spark.sql.{DataFrame, SparkSession}
           ^
D:\Work Projects\SparkWordCount\src\main\scala\SparkWordCount.scala:4: error: object apache is not a member of package org
import org.apache.log4j.Logger
           ^
D:\Work Projects\SparkWordCount\src\main\scala\SparkWordCount.scala:5: error: object apache is not a member of package org
import org.apache.log4j.Level
           ^
D:\Work Projects\SparkWordCount\src\main\scala\SparkWordCount.scala:10: error: not found: value Logger
    Logger.getLogger("org").setLevel(Level.ERROR)

Here are details about my environment; some of them may be helpful:

[environment screenshots]

Previously I had HADOOP_HOME in my Path, but I removed it on the advice of one of the related threads on StackOverflow.

Thank you in advance for your help!

UPDATE-1 My project structure:

[project structure screenshot]

UPDATE-2

If it's important: I don't have a MAVEN_HOME environment variable, and therefore there is no MAVEN_HOME in my Path. I executed "clean -> package" via the IntelliJ IDEA Maven interface.

UPDATE-3

List of libraries from Project Structure

[library list screenshots]

UPDATE-4 Information about the scala-sdk: [scala-sdk screenshot]

Upvotes: 4

Views: 2746

Answers (1)

s.polam

Reputation: 10362

Remove <scope>provided</scope> from the pom.xml file. Then go to the File tab, click on the Invalidate Caches / Restart option, and try again.

For the Maven issue, try the mvn clean scala:compile -DdisplayCmd=true -DrecompileMode=all package command.

Upvotes: 1
