Aman

Reputation: 3261

How to create SQLContext in spark using scala?

I am writing a Scala program that uses SQLContext, built with sbt. This is my build.sbt:

name := "sampleScalaProject"

version := "1.0"

scalaVersion := "2.11.7"
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.5.2"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.2"
libraryDependencies += "org.apache.kafka" % "kafka_2.11" % "0.8.2.2"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "1.5.2"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "1.5.2"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0"  

And this is my test program:

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {

  def main (args: Array[String]) {
    val sc = SparkContext
    val sqlcontext = new SQLContext(sc)
  }
} 

I am getting the error below:

Error:(8, 26) overloaded method constructor SQLContext with alternatives:
  (sparkContext: org.apache.spark.api.java.JavaSparkContext)org.apache.spark.sql.SQLContext <and>
  (sparkContext: org.apache.spark.SparkContext)org.apache.spark.sql.SQLContext
 cannot be applied to (org.apache.spark.SparkContext.type)
    val sqlcontexttest = new SQLContext(sc)  

Can anybody please let me know what the issue is? I am very new to Scala and Spark programming.

Upvotes: 8

Views: 39279

Answers (5)

Kuldeep J

Reputation: 1

If you are using the Scala shell (spark-shell), then use the statement below:

val sqlContext = spark.sqlContext

And to read Parquet files, use the statement below:

val df = sqlContext.read.parquet("/path/to/folder/containing/parquet/files/")

Upvotes: 0

Shaido

Reputation: 28322

For newer versions of Spark (2.0+), use SparkSession:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.getOrCreate()

SparkSession can do everything SQLContext can, but if needed the SQLContext can still be accessed as follows:

val sqlContext = spark.sqlContext
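
For example, a minimal sketch of the Spark 2.x style (the app name, master, and file path below are placeholders, not from the question):

import org.apache.spark.sql.SparkSession

// One SparkSession replaces both SparkContext and SQLContext as the entry point
val spark = SparkSession.builder
  .appName("SampleApp")   // placeholder app name
  .master("local[*]")     // placeholder master for local testing
  .getOrCreate()

// DataFrame reads and SQL queries go through the session directly...
val df = spark.read.parquet("/path/to/parquet/files")
df.createOrReplaceTempView("records")
spark.sql("SELECT COUNT(*) FROM records").show()

// ...or through the legacy SQLContext if older code expects one
val sqlContext = spark.sqlContext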

Upvotes: 12

Viraj Wadate

Reputation: 6123

We can simply create an SQLContext in the Scala shell, where sc is already defined:

scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc);    

Upvotes: 4

Ashutosh Shukla

Reputation: 555

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("SparkJoins").setMaster("local")
val sc = new SparkContext(conf)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)

Upvotes: 0

Justin Pihony

Reputation: 67065

You need to new your SparkContext (construct an instance with new SparkContext(conf) rather than referencing the SparkContext companion object), and that should solve it.
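
For reference, a minimal sketch of the corrected program (the app name and local master below are placeholders, not from the original question):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {

  def main(args: Array[String]): Unit = {
    // Build a configuration and construct a SparkContext instance;
    // the SparkContext companion object itself is not a SparkContext
    val conf = new SparkConf().setAppName("SqlContextSparkScala").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
  }
}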

Upvotes: 6
