codeBarer

Reputation: 2378

In Spark I am not able to create a table using hive support

I'm trying to follow the examples from

Not able to find spark-warehouse directory

to create a table using Hive support, but I keep getting this error message:

org.apache.spark.sql.AnalysisException: Hive support is required to CREATE Hive TABLE (AS SELECT);
'CreateTable `default`.`sales`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, ErrorIfExists
+- Project [num#2]
   +- SubqueryAlias test
      +- View (`test`, [num#2])
         +- Project [1 AS num#2]
            +- OneRowRelation

  at org.apache.spark.sql.errors.QueryCompilationErrors$.ddlWithoutHiveSupportEnabledError(QueryCompilationErrors.scala:1270)
  at org.apache.spark.sql.execution.datasources.HiveOnlyCheck$.$anonfun$apply$4(rules.scala:438)

Below is my code:

import org.apache.spark.sql.SparkSession
import org.apache.spark.SparkContext
import java.io.File

val sc = new SparkContext("local[*]", "LoadFiles1") 
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val warehouseLocation =  new File("spark-warehouse").getAbsolutePath
val spark = SparkSession
    .builder()
    .appName("Spark Hive Example")
    .config("spark.sql.warehouse.dir", warehouseLocation)
    .enableHiveSupport()
    .getOrCreate()    

spark.sql("select 1 as num ").registerTempTable("test")
spark.sql("create table sales as select * from test")

Upvotes: 0

Views: 531

Answers (1)

Gumada Yaroslav

Reputation: 177

It seems to me that you are missing a parameter. It worked for me once I added ("spark.sql.catalogImplementation", "hive") to the Spark config.

With these parameters, the code you wanted to run worked on my machine:

  import org.apache.spark.SparkConf
  import org.apache.spark.sql.SparkSession

  // Set the catalog implementation to Hive before the session is created
  val conf = new SparkConf()
    .set("spark.driver.bindAddress", "127.0.0.1")
    .set("spark.sql.warehouse.dir", "/Users/19658296/csp-fp-snaphot/library/src/test/resources/warehouseLocation")
    .set("spark.sql.catalogImplementation", "hive")

  val spark = SparkSession.builder.master("local[*]")
    .appName("testGetSnp")
    .config(conf)
    .getOrCreate()

  // Point Hadoop at the local filesystem instead of HDFS
  spark.sparkContext.hadoopConfiguration.set("fs.defaultFS", "file:///")
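
For completeness, a minimal sketch of the follow-up, reusing the temp view and table names from your question: with the Hive-enabled session above, the original statements should run without the AnalysisException. Note that registerTempTable is deprecated; createOrReplaceTempView is its replacement:

  // Register the one-row result as a temporary view
  // (createOrReplaceTempView replaces the deprecated registerTempTable)
  spark.sql("select 1 as num").createOrReplaceTempView("test")

  // With spark.sql.catalogImplementation=hive, CTAS now succeeds
  spark.sql("create table sales as select * from test")

  // Verify the table contents
  spark.sql("select * from sales").show()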

Also try to read this answer; it looks related.

Upvotes: 1
