sunny

Reputation: 1945

Getting Exception on "hadoop fs -ls /"

I run hadoop-2.0.5-alpha. When I list hdfs files, I get this Exception:

bin/hadoop fs -ls /
13/07/07 18:47:36 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status;

My core-site.xml looks like this:

<configuration>
  <property>
     <name>fs.defaultFS</name>
     <value>hdfs://master:8020</value>
  </property>
</configuration>

What could be wrong?

Upvotes: 0

Views: 2870

Answers (1)

Drew

Reputation: 36

If you have multiple versions of hadoop installed on your system, verify your PATH. You may be running the wrong version of hadoop as the client.

I ran into this problem when I had two versions of hadoop installed: hadoop-1.1.2 and hadoop-2.1.0-beta. It turned out that my path was incorrect and I was attempting to run the hadoop command from hadoop-1.1.2 against hadoop 2.1.0-beta.

In addition to your PATH, check your HADOOP_CONF_DIR and HADOOP_HOME environment variables to be sure they point to the correct directory for your hadoop 2 installation.
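The checks above can be run from a shell. This is a minimal sketch; the install path /opt/hadoop-2.0.5-alpha is an assumption, so substitute your actual hadoop 2 directory:

```shell
# Which hadoop executable does the shell resolve first on the PATH?
command -v hadoop || echo "no hadoop on PATH"

# What do the relevant environment variables currently point at?
echo "HADOOP_HOME=$HADOOP_HOME"
echo "HADOOP_CONF_DIR=$HADOOP_CONF_DIR"

# If an old release wins, put the hadoop 2 directories first.
# /opt/hadoop-2.0.5-alpha is an assumed install path -- adjust it.
export HADOOP_HOME=/opt/hadoop-2.0.5-alpha
export HADOOP_CONF_DIR="$HADOOP_HOME/etc/hadoop"
export PATH="$HADOOP_HOME/bin:$PATH"

# Re-check: command -v should now report the hadoop 2 binary,
# and "hadoop version" should print the matching release.
command -v hadoop || echo "still no hadoop on PATH"
```

Making these exports permanent in your shell profile (e.g. ~/.bashrc) avoids the old client silently coming back in new sessions.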

Upvotes: 1
