Reputation: 2870
I'm trying to run my jar on the Hadoop filesystem but I'm getting this exception: See image
The code runs successfully when I run it from Eclipse.
Here is my runner main class:
public class ReadCassandra extends Configured implements Tool {

    public static void main(String args[]) {
        try {
            /*ToolRunner.run(new Configuration(), new ReadCassandra(), args);
            System.exit(0);*/
            String keyspace = "Read_log";
            String clg = "readValidPost";
            String rowkey = "117761667160131";

            List<ByteBuffer> cn = new ArrayList<ByteBuffer>();
            List<String> cl = new ArrayList<String>();
            cl.addAll(MyHector.getColumn(rowkey));
            for (String string : cl) {
                ByteBuffer bf = ByteBufferUtil.bytes(string);
                cn.add(bf);
            }

            Configuration conf = new Configuration();
            Job job = new Job(conf, "MEJfsd");
            //job.setJarByClass(ReadCassandra.class);
            job.setInputFormatClass(AbstractColumnFamilyInputFormat.class);
            job.setOutputFormatClass(TextOutputFormat.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(Text.class);
            job.setMapperClass(MyMapper.class);
            job.setReducerClass(MyReducer.class);

            ConfigHelper.setInputRpcPort(job.getConfiguration(), "9160");
            ConfigHelper.setInputInitialAddress(job.getConfiguration(), "127.0.0.1");
            ConfigHelper.setInputPartitioner(job.getConfiguration(), "org.apache.cassandra.dht.Murmur3Partitioner");
            ConfigHelper.setInputColumnFamily(job.getConfiguration(), keyspace, clg);

            SlicePredicate predicate = new SlicePredicate().setColumn_names(cn);
            ConfigHelper.setInputSlicePredicate(job.getConfiguration(), predicate);

            FileSystem.get(job.getConfiguration()).delete(new Path("Output"), true);
            FileOutputFormat.setOutputPath(job, new Path("Output"));
            job.waitForCompletion(true);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public int run(String[] arg0) throws Exception {
        return 1;
    }
}
I'm running it with this command:
hadoop jar /home/winoria/Documents/JarFiles/ReadCas.jar ReadCassandra
Upvotes: 0
Views: 174
Reputation: 20245
You are setting your input format class to an abstract class:
job.setInputFormatClass(AbstractColumnFamilyInputFormat.class);
You need to set it to a concrete (instantiable) class. I don't know the Cassandra-Hadoop integration well enough to say which class makes sense in your case; just make sure it is not an abstract one.
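As a sketch of what that change could look like: Cassandra's Hadoop package ships a concrete subclass, `org.apache.cassandra.hadoop.ColumnFamilyInputFormat` (the Thrift-based input format; whether it fits your exact setup is an assumption, so verify against the Cassandra version you are running):

```java
// Concrete subclass of AbstractColumnFamilyInputFormat from cassandra-all:
import org.apache.cassandra.hadoop.ColumnFamilyInputFormat;

// Replace the abstract parent...
// job.setInputFormatClass(AbstractColumnFamilyInputFormat.class);
// ...with the concrete implementation:
job.setInputFormatClass(ColumnFamilyInputFormat.class);
```

Also note that `job.setJarByClass(ReadCassandra.class);` is commented out in your code; when submitting with `hadoop jar`, that call is how Hadoop locates your jar for the task JVMs, so leaving it out commonly causes ClassNotFoundException on the cluster even though the same code works inside Eclipse.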
Upvotes: 1