Vignesh I

Reputation: 2221

Unable to determine Hadoop version information

I have installed Hadoop on Ubuntu and it's running fine.

ubuntu:/home/hduser/hive-0.10.0-cdh4.3.1$ jps
2702 DataNode
3101 ResourceManager
4879 Jps
2948 SecondaryNameNode
3306 NodeManager

hadoop_version=Hadoop 2.0.0-cdh4.3.0

Then I installed Hive (version hive-0.10.0) from the Apache tarballs and tried running bin/hive, but I am getting the error below:

Unable to determine Hadoop version information. hadoop version returned:

/home/hduser/hadoop/etc/hadoop /usr/lib/jvm/jdk1.6.0_45/ 
Hadoop 2.0.0-cdh4.3.0
Subversion file:///var/lib/jenkins/workspace/CDH4.3.0-Packaging-Hadoop/build/cdh4/hadoop/2.0.0-cdh4.3.0/source/hadoop-common-project/hadoop-common -r 48a9315b342ca16de92fcc5be95ae3650629155a 
Compiled by jenkins on Mon May 27 19:06:57 PDT 2013 
From source with checksum a4218d77f9b12df4e3e49ef96f9d357d 
This command was run using /home/hduser/hadoop/share/hadoop/common/hadoop-common-2.0.0-cdh4.3.0.jar

I tried to resolve it with my scripting knowledge but couldn't. Digging further, I found that it fails on the line below:

if [[ "$HADOOP_VERSION" =~ $hadoop_version_re ]]; then

When I echo HADOOP_VERSION it returns nothing. HADOOP_VERSION is defined as

HADOOP_VERSION=$($HADOOP version | awk '{if (NR == 1) {print $2;}}');

and $HADOOP version yields

 /home/hduser/hadoop/etc/hadoop
 /usr/lib/jvm/jdk1.6.0_45/
 Hadoop 2.0.0-cdh4.3.0
 Subversion file:///var/lib/jenkins/workspace/CDH4.3.0-Packaging-Hadoop/build/cdh4/hadoop/2.0.0-cdh4.3.0/source/hadoop-common-project/hadoop-common -r 48a9315b342ca16de92fcc5be95ae3650629155a
 Compiled by jenkins on Mon May 27 19:06:57 PDT 2013
 From source with checksum a4218d77f9b12df4e3e49ef96f9d357d
 This command was run using /home/hduser/hadoop/share/hadoop/common/hadoop-common-2.0.0-cdh4.3.0.jar
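The failure can be reproduced in isolation. Below is a simplified sketch (the real logic lives in Hive's bin/hive script; the inputs here are faked) of how that awk extraction behaves on a clean first line versus the polluted one above:

```shell
#!/bin/sh
# Simplified reproduction of Hive's version parsing. The two strings
# simulate what `$HADOOP version` would print in each case.

clean='Hadoop 2.0.0-cdh4.3.0'
polluted='/home/hduser/hadoop/etc/hadoop
/usr/lib/jvm/jdk1.6.0_45/
Hadoop 2.0.0-cdh4.3.0'

# awk prints field 2 of the first line only (NR == 1).
good=$(printf '%s\n' "$clean"    | awk '{if (NR == 1) {print $2;}}')
bad=$(printf '%s\n' "$polluted"  | awk '{if (NR == 1) {print $2;}}')

echo "clean:    '$good'"   # '2.0.0-cdh4.3.0'
echo "polluted: '$bad'"    # '' -- the path line has only one field
```

With the polluted output, line 1 is a bare path with no second whitespace-separated field, so HADOOP_VERSION ends up empty and the later `=~ $hadoop_version_re` test fails.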

I have been stuck on this for a week now. Please help me out. Thanks.

Upvotes: 0

Views: 13275

Answers (6)

Karthik

Reputation: 759

If you have set export HADOOP_VERSION=2.0.0-cdh4.3.0 (or whatever your version number is) in your .bashrc file, comment it out by putting a # in front, like #export HADOOP_VERSION=2.0.0-cdh4.3.0, and then run hive; that should resolve the issue.

Upvotes: 0

plhn

Reputation: 5263

Check your JAR PATH (JRE_HOME)

Upvotes: 0

venus

Reputation: 1258

Execute the following command: hadoop version

hduser@ubuntu:/usr/local/hadoop/sbin$ hadoop version

Upvotes: 0

merours

Reputation: 4106

On Windows, you may hit the same problem.

Indeed, if $HADOOP_HOME is set as a DOS path (e.g. C:\hadoop), you need to convert it in Cygwin. One way to do so is to put the following line in your .bashrc:

export HADOOP_HOME="$(cygpath $HADOOP_HOME)"

Upvotes: 0

user3528338

Reputation: 81

I had the same issue; I fixed it by adding the line below to .profile and sourcing it again.

export HADOOP_VERSION="2.0.0-cdh4.2.0"

Upvotes: 1

Sebastian Ertel

Reputation: 31

The problem is already visible in your question. When the script executes $HADOOP version, it expects output like this:

Hadoop 2.0.0-cdh4.3.0
Subversion file:///var/lib/jenkins/workspace/CDH4.3.0-Packaging-Hadoop/build/cdh4/hadoop/2.0.0-cdh4.3.0/source/hadoop-common-project/hadoop-common -r 48a9315b342ca16de92fcc5be95ae3650629155a
Compiled by jenkins on Mon May 27 19:06:57 PDT 2013
From source with checksum a4218d77f9b12df4e3e49ef96f9d357d
This command was run using /home/hduser/hadoop/share/hadoop/common/hadoop-common-2.0.0-cdh4.3.0.jar

Instead, some other output snuck in (probably because you modified one of the Hadoop scripts; check conf/hadoop-env.sh):

/home/hduser/hadoop/etc/hadoop
/usr/lib/jvm/jdk1.6.0_45/
Hadoop 2.0.0-cdh4.3.0
Subversion file:///var/lib/jenkins/workspace/CDH4.3.0-Packaging-Hadoop/build/cdh4/hadoop/2.0.0-cdh4.3.0/source/hadoop-common-project/hadoop-common -r 48a9315b342ca16de92fcc5be95ae3650629155a
Compiled by jenkins on Mon May 27 19:06:57 PDT 2013
From source with checksum a4218d77f9b12df4e3e49ef96f9d357d
This command was run using /home/hduser/hadoop/share/hadoop/common/hadoop-common-2.0.0-cdh4.3.0.jar

Now the awk line no longer finds the desired version number in field 2 of the first line.

So the solution is to find out where that extra output comes from and remove it.
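Until the stray output is tracked down, one defensive workaround (my own sketch, not the stock Hive logic) is to anchor awk on the line that actually starts with "Hadoop" rather than blindly taking the first line. The polluted output below is simulated; in bin/hive the input would come from `$HADOOP version`:

```shell
#!/bin/sh
# Workaround sketch: extract the version from the first line beginning
# with "Hadoop", ignoring any lines echoed before it by modified scripts.

polluted='/home/hduser/hadoop/etc/hadoop
/usr/lib/jvm/jdk1.6.0_45/
Hadoop 2.0.0-cdh4.3.0
Subversion file:///var/lib/jenkins/workspace/...'

HADOOP_VERSION=$(printf '%s\n' "$polluted" | awk '/^Hadoop /{print $2; exit}')
echo "$HADOOP_VERSION"   # 2.0.0-cdh4.3.0
```

This only masks the symptom, though; removing the stray echo from hadoop-env.sh (or wherever it originates) is the real fix.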

Upvotes: 1
