Pratik Joshi

Reputation: 258

Executing Spring Boot Application using Spark

I am new to Apache Spark. I am trying to execute a simple Spring Boot application using Spark, but I am getting the following exception:

ERROR ApplicationMaster: User class threw exception: 
java.lang.NoClassDefFoundError: org/springframework/boot/SpringApplication
Caused by: java.lang.ClassNotFoundException: org.springframework.boot.SpringApplication

However, I am able to run this project perfectly fine from my Eclipse IDE; it executes the code I have written. My pom.xml is:

<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>1.3.3.RELEASE</version>
    <relativePath/> <!-- lookup parent from repository -->
</parent>

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <java.version>1.8</java.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter</artifactId>
    </dependency>

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>


    <dependency> <!-- Spark -->
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency> <!-- Spark SQL -->
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency> <!-- Jackson Scala module -->
        <groupId>com.fasterxml.jackson.module</groupId>
        <artifactId>jackson-module-scala_2.10</artifactId>
        <version>2.6.5</version>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>2.3.2</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>
    </plugins>
</build>

And my main spring boot class is

@SpringBootApplication
public class SparkS3Application {

   public static void main(String[] args) {
       SpringApplication.run(SparkS3Application.class, args);
       System.out.println(" *************************** called *******************");
   }
}

Upvotes: 1

Views: 2975

Answers (1)

Pratik Joshi

Reputation: 258

I added the required dependencies directly to my spark-submit command using --jars "jar path,another jar path". You need to provide all the jars, comma-separated, after --jars.
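The approach above can be sketched as a spark-submit invocation. The jar paths, main class, and application jar name below are hypothetical placeholders, not from the original post; substitute the Spring Boot jars your application actually needs:

```shell
# Pass the Spring dependencies to the cluster explicitly,
# comma-separated, via --jars (hypothetical paths):
spark-submit \
  --class com.example.SparkS3Application \
  --master yarn \
  --jars "/libs/spring-boot-1.3.3.RELEASE.jar,/libs/spring-core-4.2.5.RELEASE.jar" \
  target/spark-s3-app-0.0.1-SNAPSHOT.jar
```

Note that --jars takes a single comma-separated string, not repeated flags, and the listed jars are shipped to the executors and added to their classpath alongside the application jar.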

Second, try executing this on Spark 2.0. I was using Spark 1.6 and hitting this issue, but it works perfectly fine with Spark 2.0.

Hope this helps.

Upvotes: 3
