spark_user

Reputation: 97

Hive query from Spark - failed to parse

I am trying to do this in spark-shell:

val hiveCtx = new org.apache.spark.sql.hive.HiveContext(sc)
val listTables = hiveCtx.hql("show tables")

The second line fails to execute with this message:

warning: there were 1 deprecation warning(s); re-run with -deprecation for details
org.apache.spark.sql.hive.HiveQl$ParseException: Failed to parse: show tables
    at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:239)
    at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:50)
    at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:49)
    at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
    at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
    at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
    ...
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NullPointerException: Conf non-local session path expected to be non-null
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:204)
    at org.apache.hadoop.hive.ql.session.SessionState.getHDFSSessionPath(SessionState.java:586)
    at org.apache.hadoop.hive.ql.Context.<init>(Context.java:129)
    at org.apache.hadoop.hive.ql.Context.<init>(Context.java:116)
    at org.apache.spark.sql.hive.HiveQl$.getAst(HiveQl.scala:227)
    at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:240)
    ... 87 more

Any help would be appreciated. Thanks.

Upvotes: 1

Views: 3832

Answers (1)

reim

Reputation: 612

I encountered the very same error in my Spark application. A sample:

import scala.util.Try

import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.streaming.StreamingContext

trait EventStreamEnrichmentData {

    protected def streamingContext: StreamingContext

    def makeHiveContext() = Try { new HiveContext(streamingContext.sparkContext) }

    /**
     * Context needed to talk to the Hive metastore.
     */
    @transient private val hiveContext = makeHiveContext()

    private def currentContext: Try[HiveContext] = ???

    // AnagraficaTable and the makeAnagrafica(ctx: HiveContext) overload are
    // defined elsewhere in the application and omitted from this sample.
    private def makeAnagrafica(): Try[AnagraficaTable] = currentContext flatMap ( makeAnagrafica(_) )
    @transient protected var anagrafica = makeAnagrafica()

}

Now if you have:

private def currentContext: Try[HiveContext] = hiveContext

then, if my understanding is correct, you are using a context initialized on the driver, whereas with:

private def currentContext: Try[HiveContext] = makeHiveContext()

it depends on the caller: the context might also be created on the executors.

In our particular case we got the exception in the former scenario, with the driver-initialized context, but, details aside, the takeaway is to be careful about where the context lives.
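
A minimal sketch of that takeaway (not our actual code): a lazily instantiated, per-JVM singleton, modeled on the SQLContext singleton from the Spark Streaming programming guide and adapted here to HiveContext, makes it explicit where and when the context gets created. The enrich method, the DStream of strings and the show tables query are just placeholders.

import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.streaming.dstream.DStream

// Lazily instantiated, per-JVM HiveContext: the context is created where
// (and when) it is first requested, instead of being captured in a closure
// or restored from a checkpoint as a stale @transient field.
object HiveContextSingleton {

    @transient private var instance: HiveContext = _

    def getInstance(sc: SparkContext): HiveContext = synchronized {
        if (instance == null) {
            instance = new HiveContext(sc)
        }
        instance
    }
}

// Hypothetical usage: the only point is where the context is obtained.
def enrich(events: DStream[String]): Unit = {
    events.foreachRDD { rdd =>
        // Resolved at batch time, after any checkpoint recovery, rather than
        // relying on a field initialized when the enclosing object was built.
        val hiveCtx = HiveContextSingleton.getInstance(rdd.sparkContext)
        hiveCtx.sql("show tables").collect().foreach(println)
    }
}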

I did not investigate further, but the exception actually comes from the getHDFSSessionPath check in Hive's SessionState (the Caused by frame in the trace above).

Upvotes: 1
