user2163299

Reputation: 11

Is it not possible to write to the Hadoop file system (HDFS) from the mapper?

I'm trying to write a plain text file to the Hadoop file system (HDFS) directly from the mapper.

I do it as follows:

public void createFile(Configuration conf) throws IOException {
    FileSystem fs = FileSystem.get(conf);

    Path filenamePath = new Path(conf.get("mapred.output.dir") + "/_" + conf.get("mapred.task.id"), "tree.txt");

    try {
        if (fs.exists(filenamePath)) {
            // remove the file first
            fs.delete(filenamePath);
        }

        FSDataOutputStream out = fs.create(filenamePath);
        out.writeUTF("hello, world!");
        out.close();

    } catch (IOException ioe) {
        System.err.println("IOException during operation: " + ioe.toString());
        System.exit(1);
    }
}

It does not write anything in pseudo-distributed mode; however, in standalone mode it writes perfectly.

Where is the problem?

Upvotes: 1

Views: 426

Answers (1)

dfrankow

Reputation: 21469

I was using Amazon Elastic MapReduce (EMR) and I had to get FileSystem by URI to be able to use files from S3.

FileSystem fs = FileSystem.get(uri, conf);

That might not help you.
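
In case it does, here is a minimal sketch of how the same idea could be applied to the code from the question: resolve the FileSystem from the output path's own URI rather than taking the default one. The class name, path layout, and property names are carried over from the question purely for illustration, and I have not verified this against a pseudo-distributed setup.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class SideFileWriter {

    // Writes a small side file next to the job output, as in the question.
    public void createFile(Configuration conf) throws IOException {
        Path filenamePath = new Path(
                conf.get("mapred.output.dir") + "/_" + conf.get("mapred.task.id"),
                "tree.txt");

        // Resolve the FileSystem from the path's URI (hdfs://, s3n://, file://, ...)
        // instead of relying on whatever the default FileSystem happens to be.
        FileSystem fs = FileSystem.get(filenamePath.toUri(), conf);

        // create(path, true) overwrites an existing file, so no exists()/delete() dance.
        FSDataOutputStream out = fs.create(filenamePath, true);
        try {
            out.writeUTF("hello, world!");
        } finally {
            out.close();
        }
    }
}

With that, the file ends up on whichever file system the output path actually refers to, which was the point of using the URI variant on EMR/S3.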

Upvotes: 1
