Litchy

Reputation: 363

object hbase is not a member of package org.apache.spark.sql.execution.datasources

I am trying to use the Spark-HBase connector to read data from HBase:

import org.apache.spark.sql.execution.datasources.hbase._

the error is

object hbase is not a member of package org.apache.spark.sql.execution.datasources

In my local .m2 repository the jar for org.apache.hbase:hbase-spark already exists, so I really wonder where this package is. (The object I want to use from this package is HBaseTableCatalog.)
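For context, this is roughly how HBaseTableCatalog is used with shc, per the connector's README. This is a hedged sketch: the table name, namespace, and column family/qualifier names below are placeholders, and running it requires a live HBase cluster plus the shc jar on the classpath.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.datasources.hbase._

object ReadFromHBase {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("shc-example").getOrCreate()

    // Catalog mapping an HBase table to a DataFrame schema.
    // "table1", "cf1", and the column names are illustrative placeholders.
    val catalog =
      s"""{
         |  "table":{"namespace":"default", "name":"table1"},
         |  "rowkey":"key",
         |  "columns":{
         |    "col0":{"cf":"rowkey", "col":"key", "type":"string"},
         |    "col1":{"cf":"cf1", "col":"col1", "type":"string"}
         |  }
         |}""".stripMargin

    // Read the HBase table through the shc data source.
    val df = spark.read
      .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
      .format("org.apache.spark.sql.execution.datasources.hbase")
      .load()

    df.show()
  }
}
```

The import at the top is exactly the one failing in the question; it only resolves once the shc-core jar (not hbase-spark) is on the compile classpath.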

The relevant part of the pom.xml is

<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-spark</artifactId>
  <version>3.0.0-SNAPSHOT</version>
</dependency>

Upvotes: 1

Views: 4323

Answers (1)

Ramesh Maharjan

Reputation: 41987

The shc site clearly states the following:

Users can use the Spark-on-HBase connector as a standard Spark package. To include the package in your Spark application, use:

Note: com.hortonworks:shc-core:1.1.1-2.1-s_2.11 has not been uploaded to spark-packages.org, but will be there soon.

spark-shell, pyspark, or spark-submit:

$SPARK_HOME/bin/spark-shell --packages com.hortonworks:shc-core:1.1.1-2.1-s_2.11

Users can also include the package as a dependency in their SBT file. The format is spark-package-name:version in build.sbt:

libraryDependencies += "com.hortonworks/shc-core:1.1.1-2.1-s_2.11"

So if you are using Maven, you will have to download the jar and include it manually in your project for testing purposes.

Or you can try the shc artifacts that have been uploaded to a Maven repository.
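As a sketch of that Maven route: the coordinates below come from the version string quoted above (com.hortonworks:shc-core:1.1.1-2.1-s_2.11), and the repository URL is an assumption based on where Hortonworks typically published its artifacts; verify both before relying on them.

```xml
<!-- Hypothetical repository entry; confirm the URL is still live. -->
<repositories>
  <repository>
    <id>hortonworks-repo</id>
    <url>http://repo.hortonworks.com/content/groups/public</url>
  </repository>
</repositories>

<dependencies>
  <!-- shc-core provides org.apache.spark.sql.execution.datasources.hbase -->
  <dependency>
    <groupId>com.hortonworks</groupId>
    <artifactId>shc-core</artifactId>
    <version>1.1.1-2.1-s_2.11</version>
  </dependency>
</dependencies>
```

Note that this replaces the org.apache.hbase:hbase-spark dependency from the question, which is a different project and does not contain HBaseTableCatalog.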

Upvotes: 2
