Kaushik Lele

Reputation: 6637

Jars for hadoop mapreduce

I am following this Hadoop MapReduce tutorial given by Apache. The Java code given there uses these Apache Hadoop classes:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

But I could not figure out where to download these jars from. Searching the internet for these classes, I could see they are available here.

But what is the formal/authentic Apache repository for these classes and jars?

If the jars are shipped along with Hadoop, please let me know the path.

EDIT: The other question does not give clear instructions. I found the answer as follows.

This tutorial mentions:

Download Hadoop-core-1.2.1.jar, which is used to compile and execute the MapReduce program. Visit the following link http://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core/1.2.1 to download the jar.

So this looks like an authentic repository.

Upvotes: 1

Views: 22120

Answers (8)

Sanoajul

Reputation: 287

javac -cp /usr/hdp/2.6.2.0-205/hadoop-mapreduce/:/usr/hdp/2.6.2.0-205/hadoop/:. MyTest.java

Worked for me in CloudxLab.

Upvotes: 0

subtleseeker

Reputation: 5243

Try compiling using:

javac -cp $(hadoop classpath) MapRTest.java

In most cases, the required jars are already present in the downloaded Hadoop distribution. For more info, look into this.
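
For completeness, a minimal sketch of the full compile/package/run cycle built on the same trick (the WordCount class and the /input and /output HDFS paths are placeholders borrowed from the Apache tutorial, not part of this answer):

# compile against whatever jars the local Hadoop install already ships
mkdir -p build
javac -cp "$(hadoop classpath)" -d build WordCount.java

# package the classes and submit the job through the hadoop launcher
jar cf wordcount.jar -C build .
hadoop jar wordcount.jar WordCount /input /output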

Upvotes: 0

Madhusoodan P

Reputation: 667

The best way is to download Hadoop (3.x.y) and include the jars below from hadoop-3.x.y/share/hadoop/mapreduce:

1. hadoop-common-3.x.y.jar
2. hadoop-mapreduce-client-core-3.x.y.jar

That worked for me!
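
As an illustration only (the 3.x.y version is whatever you downloaded, and the WordCount source file is a placeholder; the command assumes the two jars were copied next to it):

# compile against just the two jars pulled from the Hadoop download
javac -cp hadoop-common-3.x.y.jar:hadoop-mapreduce-client-core-3.x.y.jar WordCount.java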

Upvotes: 2

Priyanka Yemul

Reputation: 1

If you get this type of error, then just type the following command in the terminal:

export HADOOP_HOME=$(hadoop classpath)

Note: check the names configured for your own Hadoop setup in the .bashrc file. At the time of Hadoop installation we set the Hadoop and Java paths in .bashrc, so check what appears there next to export.

Upvotes: 0

Lost Carrier

Reputation: 123

With the current version 2.7.1, I was stumbling over Missing artifact org.apache.hadoop:hadoop-mapreduce:jar:2.7.1, but found out that this jar appears to be split up into various smaller ones.

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.1</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-common</artifactId>
    <version>2.7.1</version>
</dependency>

...worked for me (no clue what this is meant for: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce/2.7.1/).

Upvotes: 0

Kaushik Lele

Reputation: 6637

This tutorial mentions:

Download Hadoop-core-1.2.1.jar, which is used to compile and execute the MapReduce program. Visit the following link http://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core/1.2.1 to download the jar.

So here you can find all the jars for the different versions.
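
For example, once hadoop-core-1.2.1.jar is downloaded from that page, compiling against it might look like this (the WordCount source file is a placeholder; the tutorial quoted above implies this single jar is enough to compile):

# compile and package against the single hadoop-core jar
mkdir -p classes
javac -cp hadoop-core-1.2.1.jar -d classes WordCount.java
jar cf wordcount.jar -C classes .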

Upvotes: 3

Dan Ciborowski - MSFT

Reputation: 7207

Using NetBeans I create a new Maven project.

Then under project files, I open the pom.xml.

I add the following inside of the <dependencies> section:

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>0.20.2</version>
    </dependency> 

After building with dependencies, I am now ready to code.
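
From there, a typical cycle is just a Maven build followed by the hadoop launcher (the artifact name, main class, and HDFS paths below are hypothetical):

# build the project jar, then submit it to the cluster
mvn package
hadoop jar target/my-job-1.0-SNAPSHOT.jar WordCount /input /output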

Upvotes: 0

Reda

Reputation: 91

The tutorial you are following uses Hadoop 1.0, which means the jars you have and the ones the tutorial uses are different. If you are using Hadoop 2.x, follow a tutorial that uses exactly that version. You don't need to download jars from a third party; you just need to know the proper use of the API for that specific Hadoop version.

Upvotes: 0
