Reputation: 645
I'm trying to run the example found here: http://thysmichels.com/2012/01/31/java-based-hdfs-api-tutorial/
But when I go to compile the Java program I get errors saying the packages don't exist, e.g.:
error: package org.apache.hadoop.conf does not exist
import org.apache.hadoop.conf.Configuration;
Hadoop 1.0.4 is installed and works fine. Every tutorial I've looked at for dealing with HDFS just starts with a program like the one in the link I provided, and none of them mention any special prerequisites. So what do I need to do to make this compile? I'm assuming I need to edit my classpath to point to the appropriate packages, but I don't know where those are located.
Also I'm running Ubuntu 12.04, Hadoop 1.0.4 on a single node cluster following the instructions here: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
Upvotes: 0
Views: 341
Reputation: 30089
I'd suggest you brush up on some Java compilation basics.
You need to do more than just a javac HDFSExample.java
- you need to include the dependency jars on the classpath. Something more like javac -cp hadoop-core-1.0.4.jar HDFSExample.java
Personally, I'd recommend looking into a build tool (such as Maven or Ant) or an IDE, as this will make things far less painful once you start organizing your code into packages and depending on multiple external libraries.
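For reference, a full compile-and-run cycle on the command line might look like this. This is only a sketch: it assumes the standard tarball layout of a Hadoop 1.0.4 install under /usr/local/hadoop (the path Michael Noll's tutorial uses), so adjust the paths to match your setup:

```shell
# Compile against hadoop-core plus the bundled dependency jars in lib/
# (quote the wildcard so the shell passes it to javac unexpanded)
javac -cp "/usr/local/hadoop/hadoop-core-1.0.4.jar:/usr/local/hadoop/lib/*" HDFSExample.java

# Run with the same classpath, plus the conf directory so your
# core-site.xml / hdfs-site.xml settings are picked up
java -cp ".:/usr/local/hadoop/hadoop-core-1.0.4.jar:/usr/local/hadoop/lib/*:/usr/local/hadoop/conf" HDFSExample
```

If the bin/hadoop script is on your PATH, `hadoop classpath` prints the full classpath Hadoop itself uses, which saves you from listing the jars by hand: `javac -cp "$(hadoop classpath)" HDFSExample.java`.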
EDIT: For example, the Maven configuration is as simple as (ok, I'm not including some of the other boilerplate pom declarations..):
<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.0.4</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
Then to compile and package into a jar:
#> mvn package
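One detail worth noting: with <scope>provided</scope>, the Hadoop classes are on the classpath at compile time but are not bundled into your jar, so run the result through the hadoop launcher script, which adds the installed Hadoop jars and conf directory for you. A sketch, where the jar name and main class (HDFSExample) are hypothetical and depend on your pom:

```shell
# hadoop sets up the classpath (hadoop-core, lib/, conf/) before invoking the main class
hadoop jar target/hdfs-example-1.0.jar HDFSExample
```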
Upvotes: 1