Danny

Reputation: 165

DataFrame Spark not found in Java class

I'm writing a Java class that uses Spark, and I get the error "DataFrame cannot be resolved to a type", along with an import error: "The import org.apache.spark.sql.DataFrame cannot be resolved". These are the class imports:

import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.sql.DataFrameReader;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;

import org.apache.spark.sql.DataFrame;

This is the file pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>SparkBD</groupId>
    <artifactId>SparkProject</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <dependencies>
        <dependency> <!-- Spark dependency -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.11</artifactId>
            <version>2.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.3.0</version>
        </dependency>
    </dependencies>
</project>

Upvotes: 3

Views: 2550

Answers (1)

user9789575

DataFrame was removed from the Java API in Spark 2.0 (in the Scala API it survives only as a type alias for Dataset[Row]). You should replace it with Dataset<Row>:

  • Drop the import org.apache.spark.sql.DataFrame line and keep import org.apache.spark.sql.Dataset (together with org.apache.spark.sql.Row)
  • Wherever you used DataFrame, use Dataset<Row> instead
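
For example, a minimal sketch of what the class could look like after the change (the app name and the people.json path are placeholders; note that since Spark 2.0, SparkSession is the preferred entry point instead of SQLContext):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkExample {
    public static void main(String[] args) {
        // SparkSession replaces SQLContext as the entry point in Spark 2.x
        SparkSession spark = SparkSession.builder()
                .appName("SparkProject")     // placeholder name
                .master("local[*]")          // run locally for testing
                .getOrCreate();

        // read() returns a Dataset<Row>, which is what the pre-2.0
        // Java API called a DataFrame
        Dataset<Row> df = spark.read().json("people.json"); // placeholder path
        df.printSchema();
        df.show();

        spark.stop();
    }
}
```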

Upvotes: 7
