user6325753

Reputation: 577

Object sql is not a member of package org.apache.spark

I am trying to work with spark-sql, but while importing

import org.apache.spark.sql.{Row, SparkSession}

I am getting the following error:

object sql is not a member of package org.apache.spark

Here are my details:

Spark version: 1.6.2
Scala version: 2.11.8
sbt version: 0.13.16

Here is my build.sbt file:

name := "sbt_demo"
version := "1.0"
scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.2"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.6.2"

Note: there is another question with the same problem on Stack Overflow, but its accepted answer didn't help me. That is why I am asking again.

Upvotes: 3

Views: 13390

Answers (3)

Daniel

Reputation: 616

I had the same problem, and this is my build.sbt file:

name := "rec"
version := "0.1"
scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"
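Note that this build.sbt pulls in only spark-core; for the org.apache.spark.sql import from the question to resolve, the matching spark-sql module would presumably be needed as well, for example:

    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"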

Some say the commands below will work:

sbt reload package    
sbt update 
sbt reload

but they didn't work for me, so I deleted the .idea directory, re-imported everything from the build.sbt file, and now it works well for me.

Upvotes: 2

Sandeep Das

Reputation: 1050

For sbt, you can use:

"org.apache.spark" %% "spark-sql" % "1.6.2" % "provided"

For Maven

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>1.6.2</version>
    </dependency>

Use the import below in your code, just before you create the DataFrame:

import sqlContext.implicits._
val df = sqlContext.createDataFrame(rows, schema)
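
For context, here is a minimal, self-contained sketch of this pattern for Spark 1.6.x; the schema and row values are purely illustrative:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.{Row, SQLContext}
    import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

    // Spark 1.6.x entry point: SparkContext plus SQLContext
    val conf = new SparkConf().setAppName("sbt_demo").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // Illustrative schema and rows (not from the question)
    val schema = StructType(Seq(
      StructField("name", StringType, nullable = true),
      StructField("age", IntegerType, nullable = true)
    ))
    val rows = sc.parallelize(Seq(Row("Alice", 30), Row("Bob", 25)))

    val df = sqlContext.createDataFrame(rows, schema)
    df.show()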

Upvotes: 1

Adrian

Reputation: 21

You need Maven Central in your sbt resolvers. And if you are behind a proxy, configure the proxy settings properly for sbt.

Also, in Spark 1.6.2 there is no SparkSession... You should use SQLContext, or move to 2.x.
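
If you do move to 2.x, a minimal sketch of the SparkSession entry point could look like this (the app name and master are illustrative):

    import org.apache.spark.sql.SparkSession

    // Spark 2.x: SparkSession replaces SQLContext as the entry point
    val spark = SparkSession.builder()
      .appName("sbt_demo")
      .master("local[*]")
      .getOrCreate()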

Upvotes: 0
