Reputation: 1175
I have a table "Gazelle" with 216 columns, and I want to get some of its columns into a JavaPairRDD. I've tried to follow this link:
How to read from hbase using spark and this one: how to fetch all of data from hbase table in spark
To import all the jars I need, I've added these dependencies to my pom file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>fr.aid.cim</groupId>
    <artifactId>spark-poc</artifactId>
    <version>1.0-SNAPSHOT</version>
    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>0.96.0-hadoop2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase</artifactId>
            <version>0.20.6</version>
        </dependency>
    </dependencies>
</project>
and this is my code:
SparkConf sparkConf = new SparkConf().setAppName("JavaWordCount");
JavaSparkContext ctx = new JavaSparkContext(sparkConf);
//JavaSQLContext jsql = new JavaSQLContext(sc);
//test hbase table
HBaseConfiguration conf = new HBaseConfiguration();
conf.set("hbase.zookeeper.quorum", "192.168.10.32");
conf.set("hbase.zookeeper.property.clientPort", "2181");
conf.set("hbase.master", "192.168.10.32" + ":60000");
conf.set("hbase.cluster.distributed", "true");
conf.set("hbase.rootdir", "hdfs://localhost:8020/hbase");
//conf.set(TableInputFormat.INPUT_TABLE, "gazelle_hive4");
String tableName = "gazelle_hbase4";
HTable table = new HTable(conf, tableName);
JavaPairRDD<ImmutableBytesWritable, Result> hBaseRDD = ctx
        .newAPIHadoopRDD(
                conf,
                TableInputFormat.class,
                org.apache.hadoop.hbase.io.ImmutableBytesWritable.class,
                org.apache.hadoop.hbase.client.Result.class);
hBaseRDD.coalesce(1, true).saveAsTextFile(path + "hBaseRDD");
But I have a problem with "TableInputFormat".
Error: Cannot resolve symbol TableInputFormat. Is there another library I should import or another dependency I should add?
Note: I haven't created any XML file. Should I create "hbase-default.xml" and "hbase-site.xml"? If yes, how?
Thank you in advance for your help.
Upvotes: 1
Views: 6172
Reputation: 2773
According to this thread on the Apache Spark user list, you might need a few more things.
If the error is happening at runtime, you should explicitly specify the HBase jars to Spark:
spark-submit --driver-class-path $(hbase classpath) --jars /usr/lib/hbase/hbase-server.jar,/usr/lib/hbase/hbase-client.jar,/usr/lib/hbase/hbase-common.jar,/usr/lib/hbase/hbase-protocol.jar,/usr/lib/hbase/lib/protobuf-java-2.5.0.jar,/usr/lib/hbase/lib/htrace-core.jar --class YourClassName --master local App.jar
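Here `$(hbase classpath)` expands to the output of the `hbase classpath` shell command, so the driver sees the same HBase jars the cluster uses; note that the /usr/lib/hbase paths above assume a typical packaged install and may differ on your machine.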
If the error is happening at compile time, you might be missing a dependency: hbase-server, as stated in the thread.
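For example, a minimal sketch of the missing dependency, assuming a version that matches your hbase-client (TableInputFormat lives in the org.apache.hadoop.hbase.mapreduce package of this artifact):

<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-server</artifactId>
    <!-- version assumed to match the hbase-client version in your pom -->
    <version>0.96.0-hadoop2</version>
</dependency>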
Upvotes: 1