Asha Koshti

Reputation: 2993

NullPointerException when working with Coprocessor in HBASE?

I am using HBase 0.94.8 with HDFS. I have implemented a coprocessor for summation of values. The table has only two rows:

hbase(main):043:0> scan 'demo'

ROW COLUMN+CELL

row1 column=info:category, timestamp=1375438808010, value=web
row1 column=info:hits, timestamp=1375438797824, value=123
row2 column=info:category, timestamp=1375438834518, value=mail
row2 column=info:hits, timestamp=1375438822093, value=1321

hbase(main):043:0> describe 'demo'

'demo', {METHOD => 'table_att', coprocessor$1 => '|org.apache.hadoop.hbase.coprocessor.AggregateImplementation||'},
{NAME => 'info', DATA_BLOCK_ENCODING => 'NONE', BLOOMFILTER => 'NONE', REPLICATION_SCOPE => '0', VERSIONS => '3',
COMPRESSION => 'NONE', MIN_VERSIONS => '0', TTL => '2147483647', KEEP_DELETED_CELLS => 'false',
BLOCKSIZE => '65536', IN_MEMORY => 'false', ENCODE_ON_DISK => 'true', BLOCKCACHE => 'true'}   ENABLED: true
1 row(s) in 0.0670 seconds

My code is given below:

import org.apache.hadoop.conf.Configuration; 
import org.apache.hadoop.hbase.HBaseConfiguration; 
import org.apache.hadoop.hbase.HColumnDescriptor; 
import org.apache.hadoop.hbase.HTableDescriptor; 
import org.apache.hadoop.hbase.KeyValue; 
import org.apache.hadoop.hbase.client.HBaseAdmin; 
import org.apache.hadoop.hbase.client.HTable; 
import org.apache.hadoop.hbase.client.Result; 
import org.apache.hadoop.hbase.client.ResultScanner; 
import org.apache.hadoop.hbase.client.Scan; 
import org.apache.hadoop.hbase.client.coprocessor.AggregationClient; 
import org.apache.hadoop.hbase.client.coprocessor.LongColumnInterpreter;
import org.apache.hadoop.hbase.util.Bytes; 
import org.apache.hadoop.hbase.coprocessor.ColumnInterpreter; 
import org.apache.hadoop.hbase.coprocessor.CoprocessorHost;
public class webAggregator {

   // private static final byte[] EDRP_FAMILY = Bytes.toBytes("EDRP");
   // private static final byte[] EDRP_QUALIFIER = Bytes.toBytes("advanceKWh");
   public static void testSumWithValidRange(Configuration conf,
                 String[] otherArgs) throws Throwable {
          byte[] EDRP_TABLE = Bytes.toBytes(otherArgs[0]);
          byte[] EDRP_FAMILY = Bytes.toBytes(otherArgs[1]);
          byte[] EDRP_QUALIFIER = Bytes.toBytes(otherArgs[2]);

          conf.set("hbase.zookeeper.quorum", "master");
          conf.set("hbase.zookeeper.property.clientPort", "2222");

          conf.setLong("hbase.rpc.timeout", 600000);

          conf.setLong("hbase.client.scanner.caching", 1000);
          conf.set(CoprocessorHost.REGION_COPROCESSOR_CONF_KEY,
                       "org.apache.hadoop.hbase.coprocessor.AggregateImplementation");

          // Utility.CreateHBaseTable(conf, otherArgs[1], otherArgs[2], true);
          /*HBaseAdmin admin = new HBaseAdmin(conf);
          HTableDescriptor desc = new HTableDescriptor(EDRP_TABLE);
          desc.addFamily(new HColumnDescriptor(EDRP_FAMILY));
          admin.createTable(desc);*/

          AggregationClient aClient = new AggregationClient(conf);
          Scan scan = new Scan();
          scan.addColumn(EDRP_FAMILY, EDRP_QUALIFIER);


          HTable table = new HTable(conf, "demo");
          Scan s = new Scan();
          ResultScanner ss = table.getScanner(s);
          for(Result r:ss){
              for(KeyValue kv : r.raw()){
                 System.out.print(new String(kv.getRow()) + " ");
                 System.out.print(new String(kv.getFamily()) + ":");
                 System.out.print(new String(kv.getQualifier()) + " ");
                 System.out.print(kv.getTimestamp() + " ");
                 System.out.println(new String(kv.getValue()));
              }
          }

          final ColumnInterpreter<Long, Long> ci = new LongColumnInterpreter();
          long sum = aClient.sum(Bytes.toBytes(otherArgs[0]), ci, scan);
          System.out.println(sum);
   }

   /**
   * Main entry point.
   *
   * @param args The command line parameters.
   * @throws Exception
   *             When running the job fails.
   */
   public static void main(String[] args) throws Exception {
      Configuration conf = HBaseConfiguration.create();
      String[] otherArgs = { "demo", "info", "hits" };
      try {
         testSumWithValidRange(conf, otherArgs);
      } catch (Throwable e) {
         e.printStackTrace();
      }
   }
}

My stack trace is given below:

java.lang.NullPointerException
    at webAggregator.testSumWithValidRange(webAggregator.java:62)
    at webAggregator.main(webAggregator.java:79)

Please help with this.

Upvotes: 0

Views: 1569

Answers (2)

SmallWong

Reputation: 71

The problem is the data type of the stored value.

A. put.addColumn(Bytes.toBytes("objects"), Bytes.toBytes("info"), Bytes.toBytes(1.0));

With A, max/min/sum work fine. But with

B. put.addColumn(Bytes.toBytes("objects"), Bytes.toBytes("info"), Bytes.toBytes("1.0"));

it does not, because the value is stored as a string rather than as a binary number. From the hbase shell you can see the difference:

A. column=objects:info, timestamp=1525942759312, value=?\xF0\x00\x00\x00\x00\x00\x00
B. column=objects:info, timestamp=1525941054901, value=1.0
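
As a minimal sketch of the difference (assuming an HBase 1.x-style client with Connection/Table and Put.addColumn; the table name "objects_table" below is just a placeholder), case A writes an 8-byte binary value that a numeric ColumnInterpreter can decode, while case B writes a string that makes the interpreter return null:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class PutValueTypes {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("objects_table"))) {
            // Case A: store the number as 8 binary bytes (an IEEE 754 double here).
            // A matching ColumnInterpreter can decode this, so sum/max/min work.
            Put binaryPut = new Put(Bytes.toBytes("row1"));
            binaryPut.addColumn(Bytes.toBytes("objects"), Bytes.toBytes("info"), Bytes.toBytes(1.0));
            table.put(binaryPut);

            // Case B: store the number as the string "1.0". The interpreter cannot
            // decode it, returns null, and the aggregation fails with a
            // NullPointerException.
            Put stringPut = new Put(Bytes.toBytes("row2"));
            stringPut.addColumn(Bytes.toBytes("objects"), Bytes.toBytes("info"), Bytes.toBytes("1.0"));
            table.put(stringPut);
        }
    }
}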

Upvotes: 0

user966085

Reputation: 303

I got the same error as you. After some investigation, I found that the problem was my column type being an integer, so the LongColumnInterpreter.getValue method returned null.

From your code and output, I am sure that your 'info:hits' column is a string column, not a long column.

Consider changing hits to a real long column; from the hbase shell its value should then look like

11Ak8Z4Mswtk00:MXf1NZ                        column=f1:dp, timestamp=1400144073173, value=\x00\x00\x00\x00\x00\x00\x00b 
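
A rough sketch of that conversion, assuming the 0.94-era client API and the 'demo' / 'info:hits' names from the question (the rewrite simply puts an 8-byte long back under the same row and column, at a newer timestamp):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.util.Bytes;

public class ConvertHitsToLong {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "demo");
        byte[] family = Bytes.toBytes("info");
        byte[] qualifier = Bytes.toBytes("hits");

        Scan scan = new Scan();
        scan.addColumn(family, qualifier);
        ResultScanner scanner = table.getScanner(scan);
        for (Result r : scanner) {
            byte[] raw = r.getValue(family, qualifier);
            if (raw == null) {
                continue;
            }
            // Parse the old string value (e.g. "123") and rewrite it as an
            // 8-byte long, which is what LongColumnInterpreter expects.
            long hits = Long.parseLong(Bytes.toString(raw));
            Put put = new Put(r.getRow());
            put.add(family, qualifier, Bytes.toBytes(hits));
            table.put(put);
        }
        scanner.close();
        table.close();
    }
}

After that, aClient.sum(Bytes.toBytes("demo"), new LongColumnInterpreter(), scan) should return the expected total instead of throwing a NullPointerException.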

Or you can write your own ColumnInterpreter that knows how to sum string values.

Upvotes: 0
