Smruti Ranjan

Reputation: 99

Hive on Spark is not working - Failed to create spark client

I am getting the error below while executing a Hive query with Spark as the execution engine.

Error:
    Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
    FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask

Hive Console:
    hive> set hive.execution.engine=spark;
    hive> set spark.master=spark://INBBRDSSVM15.example.com:7077;
    hive> set spark.executor.memory=2g;

Versions:

    Hadoop - 2.7.0
    Hive - 1.2.1
    Spark - 1.6.1

Upvotes: 3

Views: 5570

Answers (1)

Ram Ghadiyaram

Reputation: 29145

The YARN container memory was smaller than the Spark executor requirement. I set the YARN container memory and maximum allocation to be greater than Spark executor memory + overhead. Check 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
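For example, with spark.executor.memory=2g as set in the question, Spark 1.6 adds a default executor overhead of max(384 MB, 10% of executor memory), so each container must be able to hold roughly 2048 + 384 = 2432 MB. A minimal yarn-site.xml sketch under that assumption (the 3072 MB values are illustrative headroom, not taken from the answer):

    <!-- Illustrative values: size both limits above
         spark.executor.memory + overhead (2048 MB + 384 MB here). -->
    <property>
      <!-- Largest single container YARN will allocate -->
      <name>yarn.scheduler.maximum-allocation-mb</name>
      <value>3072</value>
    </property>
    <property>
      <!-- Total memory YARN may use on each NodeManager -->
      <name>yarn.nodemanager.resource.memory-mb</name>
      <value>3072</value>
    </property>

After changing these, restart the YARN ResourceManager and NodeManagers so the new limits take effect.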

Please see Source here

Upvotes: 1
