Reputation: 446
I created a .java file to run on Cloudera Hadoop. To compile it, I run:
javac -classpath $HADOOP_COMMON_HOME/share/hadoop/mapreduce/hadoop-mapreduce-client- core-3.0.0-SNAPSHOT.jar -d multifetch_classes MultiFetch.java
Errors:
MultiFetch.java:12: package org.apache.hadoop.conf does not exist
import org.apache.hadoop.conf.Configuration;
^
MultiFetch.java:13: package org.apache.hadoop.conf does not exist
import org.apache.hadoop.conf.Configured;
^
MultiFetch.java:14: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.Path;
^
MultiFetch.java:15: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.Text;
Am I selecting the wrong jar, or am I following the wrong procedure to compile the Java file? Please help me correct this error.
Upvotes: 1
Views: 4031
Reputation: 20566
All of the above errors come from the compiler not being able to find the Hadoop libraries needed to build your sample application.
A sample build command for a Java-based Map/Reduce application looks like this:
$javac \
-classpath ${HADOOP_HOME}/hadoop-${HADOOP_VERSION}-core.jar \
-d wordcount_classes \
WordCount.java
In my case, the jar file is hadoop-0.20.203.1-SNAPSHOT-core.jar. If you take a look at your command, you will see something is not right:
javac -classpath $HADOOP_COMMON_HOME/share/hadoop/mapreduce/hadoop-mapreduce-client- core-3.0.0-SNAPSHOT.jar -d multifetch_classes MultiFetch.java
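One thing that stands out in the quoted command is the space inside the jar name (`hadoop-mapreduce-client- core-...`). A quick shell check, needing no Hadoop at all, shows that the shell splits it into two separate arguments, so `javac` never sees the full jar path:

```shell
# The space acts as an argument separator: javac would receive a classpath
# ending in "...client-" plus an extra argument "core-3.0.0-SNAPSHOT.jar",
# instead of one complete jar path.
set -- hadoop-mapreduce-client- core-3.0.0-SNAPSHOT.jar
echo "$#"   # prints 2
```

If the space is only a copy-paste artifact, the remaining issue is still that the classpath must point at a jar that actually contains the missing packages.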
You can build your sample by putting the required jar(s) on the classpath, as in the sample above.
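For a Hadoop 3.x layout like the one in the question, a sketch of a working compile command follows, assuming `$HADOOP_COMMON_HOME` points at the root of the install. Note that the failing imports (`org.apache.hadoop.conf`, `.fs`, `.io`) are packaged in `hadoop-common-*.jar` under `share/hadoop/common/`, not in the mapreduce-client-core jar, so that directory must be on the classpath as well:

```shell
# Sketch, assuming $HADOOP_COMMON_HOME is the Hadoop 3.x install root.
# Wildcards pull in hadoop-common (for conf/fs/io) plus the mapreduce jars.
HADOOP_CP="$HADOOP_COMMON_HOME/share/hadoop/common/*:$HADOOP_COMMON_HOME/share/hadoop/mapreduce/*"
javac -classpath "$HADOOP_CP" -d multifetch_classes MultiFetch.java
```

On an installed cluster, `hadoop classpath` prints the full framework classpath and can be used instead of building the variable by hand.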
Upvotes: 1