QGA

Reputation: 3192

Can't retrieve files from Hadoop HDFS

I am learning how to read/write files from/to HDFS.

This is the code I use for reading:

import java.io.InputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class FileSystemCat {

    public static void main(String[] args) throws Exception {

        String uri = "/user/hadoop/file.txt";
        Configuration conf = new Configuration();
        conf.addResource(new Path("/usr/local/hadoop/etc/hadoop/core-site.xml"));
        conf.addResource(new Path("/usr/local/hadoop/etc/hadoop/hdfs-site.xml"));

        FileSystem fs = FileSystem.get(URI.create(uri), conf);

        InputStream in = null;
        try {
            in = fs.open(new Path(uri));
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}

The file is there:

[screenshot: hadoop cluster]

However, I get the following when I run my code in Eclipse:

Exception in thread "main" java.io.FileNotFoundException: File /user/hadoop/file.txt does not exist
at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:511)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:724)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:501)
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:397)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:137)
at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:339)
at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764)
at hadoop.FileSystemCat.main(FileSystemCat.java:22)
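Note that the stack trace goes through RawLocalFileSystem: because the URI has no scheme, Hadoop resolves it against the default filesystem, which falls back to the local one whenever core-site.xml is not actually on the classpath. A quick stdlib check (no Hadoop needed, class name made up for illustration) of how a scheme-less path parses:

```java
import java.net.URI;

public class SchemeCheck {
    public static void main(String[] args) {
        // No scheme: Hadoop would fall back to fs.defaultFS
        // (file:/// when no config is found on the classpath).
        URI bare = URI.create("/user/hadoop/file.txt");
        System.out.println("bare scheme: " + bare.getScheme());           // prints "bare scheme: null"

        // Explicit scheme: Hadoop looks up a FileSystem registered for "hdfs".
        URI qualified = URI.create("hdfs://localhost/user/hadoop/file.txt");
        System.out.println("qualified scheme: " + qualified.getScheme()); // prints "qualified scheme: hdfs"
    }
}
```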

As the path I also tried both file:///user/hadoop/file.txt and hdfs:///user/hadoop/file.txt

For the latter the error is slightly different:

Exception in thread "main" java.io.IOException: No FileSystem for scheme: hdfs
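A "No FileSystem for scheme: hdfs" error typically means the HDFS client classes are not on the Eclipse classpath (hadoop-common alone only registers file://). A hedged sketch of the Maven dependency that usually fixes it, assuming a Maven build; the version shown is illustrative and should match your cluster:

```xml
<!-- Assumption: hadoop-client transitively pulls in hadoop-hdfs, whose
     service entry registers a FileSystem implementation for hdfs:// -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.7.3</version>
</dependency>
```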

core-site.xml

<configuration>
   <property>
     <name>fs.default.name</name>
     <value>hdfs://localhost/</value>
   </property>
</configuration>
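As an aside, fs.default.name is the deprecated name for this property; in Hadoop 2.x the same value is usually set under fs.defaultFS (the old name still works, but a sketch of the current form, with the value unchanged from above):

```xml
<configuration>
   <property>
     <!-- fs.defaultFS supersedes the deprecated fs.default.name -->
     <name>fs.defaultFS</name>
     <value>hdfs://localhost/</value>
   </property>
</configuration>
```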

hdfs-site.xml

<configuration>
 <property>
   <name>dfs.replication</name>
   <value>2</value>
 </property>

 <property>
   <name>dfs.namenode.name.dir</name>
   <value>file:///usr/local/hadoop_store/hdfs/namenode/</value>
 </property>

 <property>
   <name>dfs.datanode.data.dir</name>
   <value>file:///usr/local/hadoop_store/hdfs/datanode/,file:///mnt/hadoop/hadoop_store/hdfs/datanode/</value>
 </property>

 <property>
   <name>dfs.webhdfs.enabled</name>
   <value>true</value>
 </property>
</configuration>

Any ideas?

Thanks

Upvotes: 1

Views: 5074

Answers (3)

Kirito

Reputation: 31

You should change the line

FileSystem fs = FileSystem.get(URI.create(uri),conf);

to something like this:

FileSystem fs = FileSystem.get(URI.create("hdfs://localhost"), conf);

That should work, provided your path exists in HDFS.

To check whether the path is in HDFS, you can run hadoop fs -ls / on the command line.

Upvotes: 3

Somum

Reputation: 2422

If you want to read data from an HDFS file, this code will do it.

package com.yp.util;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;


public class ReadHadoopFileData {

    public static void main(String[] args) throws IOException {

        Configuration conf = new Configuration();
        FileSystem hdfs = FileSystem.get(conf);

        Path hdfsFile = new Path(args[0]);

        // try-with-resources closes the reader (and the underlying HDFS stream)
        try (BufferedReader br = new BufferedReader(new InputStreamReader(hdfs.open(hdfsFile)))) {
            String line;
            while ((line = br.readLine()) != null) {
                System.out.println(line);
            }
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }
}
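The read loop itself is plain java.io and can be tried without a cluster; a minimal sketch that swaps hdfs.open(hdfsFile) for an in-memory StringReader (class name and sample lines are made up for illustration):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class ReadLoopDemo {
    public static void main(String[] args) throws IOException {
        // StringReader stands in for hdfs.open(hdfsFile); the loop is identical.
        try (BufferedReader br = new BufferedReader(new StringReader("first line\nsecond line\n"))) {
            String line;
            while ((line = br.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```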

When you run it from the command line with hadoop jar, Hadoop takes care of all the environment settings for you.

The command to run the above program (assuming you created Read.jar and the HDFS file is part-r-00000):

hadoop jar Read.jar com.yp.util.ReadHadoopFileData /MyData/part-r-00000

Upvotes: 1

Balduz

Reputation: 3570

Add the XML files with the HDFS configuration parameters:

Configuration conf = new Configuration();
conf.addResource(new Path("your_hadoop_path/conf/core-site.xml"));
conf.addResource(new Path("your_hadoop_path/conf/hdfs-site.xml"));
FileSystem fs = FileSystem.get(URI.create(uri),conf);

Upvotes: 1
