
Reputation: 13

object Neo4j is not a member of package org.neo4j.spark

I am trying to build a Spark application using the Neo4j connector, but when I try to import Neo4j I get the following error:

object Neo4j is not a member of package org.neo4j.spark
import org.neo4j.spark.Neo4j

My build.sbt file looks like this:

libraryDependencies += "org.neo4j" % "neo4j-connector-apache-spark_2.12" % "4.1.4_for_spark_2.4"

//libraryDependencies += "org.neo4j" % "neo4j-connector-apache-spark_2.11" % "4.0.1_for_spark_2.4"

//libraryDependencies += "neo4j-contrib" %% "neo4j-spark-connector" % "2.1.0-M4"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0"

// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0"

And my code in Scala:

import org.apache.spark.{SparkConf, SparkContext}
import org.neo4j.spark.Neo4j

object first {

  def main(args: Array[String]): Unit = {

    val conf = new SparkConf()
      .setAppName("Test-neo4j-skel") // App Name
      .setMaster("local[*]")         // local mode

    val sc = new SparkContext(conf)

    val neo = Neo4j(sc)

    val rdd = neo.cypher("MATCH (n:Person) RETURN id(n) as id").loadRowRdd
    rdd.count
  }
}

Thank You.

Upvotes: 0

Views: 179

Answers (1)

conker84

Reputation: 45

With import org.neo4j.spark.Neo4j you're leveraging the old APIs, which the 4.x connector no longer provides.

Please read the documentation:

https://neo4j.com/docs/spark/current/quickstart/

This page in particular explains how to read data with a Cypher query:

https://neo4j.com/docs/spark/current/reading/#read-query
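
With the 4.x connector, reads go through the Spark DataSource API instead of the old Neo4j(sc) wrapper. A minimal sketch of the query above, assuming a local Neo4j instance; the bolt URL and credentials are placeholders to replace with your own:

import org.apache.spark.sql.SparkSession

object First {

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("Test-neo4j-skel")
      .master("local[*]")
      .getOrCreate()

    // Read via the connector's DataSource, passing the Cypher query
    // through the "query" option. URL and credentials are placeholders.
    val df = spark.read
      .format("org.neo4j.spark.DataSource")
      .option("url", "bolt://localhost:7687")
      .option("authentication.basic.username", "neo4j")
      .option("authentication.basic.password", "password")
      .option("query", "MATCH (n:Person) RETURN id(n) AS id")
      .load()

    println(df.count())
    spark.stop()
  }
}

Note that the result comes back as a DataFrame rather than an RDD of Rows, so the loadRowRdd call from the old API has no direct equivalent.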

Upvotes: 1
