user1585111

Reputation: 1019

Getting HBase exception: No regions passed

Hi, I am new to HBase and I'm trying to learn how to bulk load data into an HBase table using MapReduce.

But I am getting the exception below:

Exception in thread "main" java.lang.IllegalArgumentException: No regions passed
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.writePartitions(HFileOutputFormat2.java:307)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configurePartitioner(HFileOutputFormat2.java:527)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:391)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:356)
    at JobDriver.run(JobDriver.java:108)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at JobDriver.main(JobDriver.java:34)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

This is my Mapper code:

public void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {

    System.out.println("Value in Mapper: " + value.toString());
    String[] values = value.toString().split(",");
    byte[] row = Bytes.toBytes(values[0]);
    ImmutableBytesWritable k = new ImmutableBytesWritable(row);
    KeyValue kvProtocol = new KeyValue(row, "PROTOCOLID".getBytes(),
            "PROTOCOLID".getBytes(), values[1].getBytes());
    context.write(k, kvProtocol);
}

This is my Job configuration:

public class JobDriver extends Configured implements Tool {

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new JobDriver(), args));
    }

    @Override
    public int run(String[] arg0) throws Exception {
        // HBase configuration
        System.out.println("**********Starting Hbase*************");
        Configuration conf = HBaseConfiguration.create();
        Job job = new Job(conf, "TestHFileToHBase");
        job.setJarByClass(JobDriver.class);
        job.setOutputKeyClass(ImmutableBytesWritable.class);
        job.setOutputValueClass(KeyValue.class);
        job.setMapperClass(LoadMapper.class);
        job.setOutputFormatClass(HFileOutputFormat2.class);
        HTable table = new HTable(conf, "kiran");
        FileInputFormat.addInputPath(job, new Path("hdfs://192.168.61.62:9001/sampledata.csv"));
        FileOutputFormat.setOutputPath(job, new Path("hdfs://192.168.61.62:9001/deletions_6.csv"));
        HFileOutputFormat2.configureIncrementalLoad(job, table);
        return job.waitForCompletion(true) ? 0 : 1;
    }
}

Can anyone please help me resolve this exception?

Upvotes: 2

Views: 1366

Answers (1)

TheRoyal Llama

Reputation: 56

You have to create the table first. You can do it with the code below:

// Create the table and pre-split it into regions
HTableDescriptor descriptor = new HTableDescriptor(Bytes.toBytes(tableName));
descriptor.addFamily(new HColumnDescriptor(Constants.COLUMN_FAMILY_NAME));

HBaseAdmin admin = new HBaseAdmin(config);

// Region boundaries: 16-byte keys from 0x00...00 up to 0xFF...FF
byte[] startKey = new byte[16];
Arrays.fill(startKey, (byte) 0);

byte[] endKey = new byte[16];
Arrays.fill(endKey, (byte) 255);

admin.createTable(descriptor, startKey, endKey, REGIONS_COUNT);
admin.close();
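
Note that the 16-byte boundary keys in this sketch assume fixed-width binary row keys (MD5-style hashes). With text row keys like the ones in your CSV, any reasonable pre-split will get you past the error, but split points that follow your actual key distribution will balance the regions better.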

Or directly from the HBase shell with the command:

create 'kiran', 'colfam1'

The exception is thrown because the list of region start keys is empty: line 306
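
For reference, the check that throws it looks roughly like this (paraphrased from the HBase source around the cited line, not an exact copy):

// Inside HFileOutputFormat2.writePartitions (paraphrased):
// the table has no regions yet, so there are no start keys to partition on.
if (startKeys.isEmpty()) {
    throw new IllegalArgumentException("No regions passed");
}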

More info can be found here.

Note that the table name must match the one used in your code (kiran).
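
Also worth noting: creating the table lets the job run, but the HFiles it produces still have to be loaded into the table afterwards. Here is a minimal sketch of that completion step, reusing the output path and table name from your code (LoadIncrementalHFiles is the standard tool for loading HFileOutputFormat2 output):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;

public class CompleteBulkLoad {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "kiran");
        // Moves the HFiles written by the job into the table's regions
        LoadIncrementalHFiles loader = new LoadIncrementalHFiles(conf);
        loader.doBulkLoad(new Path("hdfs://192.168.61.62:9001/deletions_6.csv"), table);
        table.close();
    }
}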

Upvotes: 4
