Mbula Guy Marcel

Reputation: 81

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration

I'm trying to create a table in HBase (on a specified cluster) with the following code:

import org.apache.hadoop.hbase.client.{HBaseAdmin, HTable, Put}
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.{HBaseConfiguration, HColumnDescriptor, HTableDescriptor}

object ImportData {
  val cf = "d"

  def createTable(tableName: String, nameSpace: String, reset: Boolean): HTable = {
    val fullName = nameSpace + ":" + tableName

    // initialize configuration and admin
    val hbaseConfig = HBaseConfiguration.create()
    val hbaseAdmin = new HBaseAdmin(hbaseConfig)

    // create the table if it does not exist (or recreate it when reset is requested)
    if (!hbaseAdmin.isTableAvailable(fullName) || reset) {
      if (hbaseAdmin.isTableAvailable(fullName) && reset) { // force-delete the table
        hbaseAdmin.disableTable(fullName)
        hbaseAdmin.deleteTable(fullName)
      }

      val tableDesc = new HTableDescriptor(fullName.getBytes())
      val tableFamily = new HColumnDescriptor(cf)

      // add the column family to the table descriptor, then create the table
      tableDesc.addFamily(tableFamily)
      hbaseAdmin.createTable(tableDesc)
    }

    hbaseConfig.set(TableInputFormat.INPUT_TABLE, fullName)
    val table = new HTable(hbaseConfig, fullName)
    println(">>> Table " + fullName + " created on HBase")
    table
  }

  // put data in the table
  def writeToTable(table: HTable, columnNames: List[String], values: List[String]): Unit = {
    val p = new Put(Bytes.toBytes("row1"))
    for (i <- columnNames.indices) {
      p.add(Bytes.toBytes(cf), Bytes.toBytes(columnNames(i)), Bytes.toBytes(values(i)))
    }
    table.put(p)
    table.close()
  }
}

I run it with Spark on a HUE server, but I get the following error:

17/10/10 12:04:34 ERROR ApplicationMaster: User class threw exception: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
**java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration**
    at com.renault.fic_histo.parsing.ImportData$.createTable(ImportData.scala:13)
    at com.renault.fic_histo.parsing.Global_Main.save_fic_histo(Global_Main.scala:32)
    at com.renault.fic_histo.parsing.Global_Main$.main(Global_Main.scala:47)
    at com.renault.fic_histo.parsing.Global_Main.main(Global_Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:559)
**Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration**
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

I read that I have to add the HBase classpath in the hadoop-env.sh file, with the following code:

$ export HADOOP_CLASSPATH=$HBASE_HOME/hbase-0.94.22.jar:\
    $HBASE_HOME/hbase-0.94.22-test.jar:\
    $HBASE_HOME/conf:\
    ${HBASE_HOME}/lib/zookeeper-3.4.5.jar:\
    ${HBASE_HOME}/lib/protobuf-java-2.4.0a.jar:\
    ${HBASE_HOME}/lib/guava-11.0.2.jar

Here are my questions:
1. I am not running it locally, so I can't change this configuration. What can I do to solve this problem?
2. Should I connect to or specify the HBase cluster in my code? Can someone point me to a good tutorial to learn this?

Thank you

Upvotes: 1

Views: 7177

Answers (1)

Shubhangi

Reputation: 2254

The class org.apache.hadoop.hbase.HBaseConfiguration is defined in hbase-common.jar, so you need to include hbase-common.jar in your HADOOP_CLASSPATH.
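Since you say you cannot edit hadoop-env.sh on the cluster, a common alternative is to ship the HBase client jars with the job itself via `spark-submit --jars`. A sketch, using the main class from your stack trace; the jar paths, file names, and application jar name below are placeholders and must be adjusted to your installation:

```shell
# Sketch only: /opt/hbase/lib and the jar file names are assumptions --
# point them at wherever the HBase client jars live on your edge node.
spark-submit \
  --class com.renault.fic_histo.parsing.Global_Main \
  --master yarn \
  --deploy-mode cluster \
  --jars /opt/hbase/lib/hbase-common.jar,/opt/hbase/lib/hbase-client.jar,/opt/hbase/lib/hbase-protocol.jar \
  fic_histo.jar
```

Regarding your second question: to point the code at a specific cluster, you can either put the cluster's hbase-site.xml on the classpath (HBaseConfiguration.create() picks it up automatically), or set the ZooKeeper quorum explicitly on the configuration before creating the HBaseAdmin, e.g. `conf.set("hbase.zookeeper.quorum", "zk-host1,zk-host2")` and, if needed, `hbase.zookeeper.property.clientPort`.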

Upvotes: 2
