kostas

Reputation: 63

Append data to file in hadoop using java api

I have created a file with the results of a sequence of map-reduce jobs. The program I've made outputs some results iteratively, and I want to append that data to the results file using the Java API. I have tried fs.append but it doesn't work. For the time being I am using the built-in Java libraries (Eclipse 4.2.2), and once I'm done debugging I'll package it as a jar and run it on a cluster.

First of all, is "append" supported in HDFS? And if yes, can anyone tell me how it's done? Thanks in advance.

The code I am using to do this job is the following:

try {
    Path pt = new Path("/home/results.txt");
    FileSystem fs = FileSystem.get(new Configuration());
    // fs.append requires append support to be enabled on the cluster
    BufferedWriter br = new BufferedWriter(new OutputStreamWriter(fs.append(pt)));
    String line = "something";
    br.write(line);
    br.close();
} catch (IOException e) {
    System.out.println("Append failed: " + e.getMessage());
}
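While debugging locally with plain Java (no HDFS involved), appending is straightforward with java.nio. A minimal sketch, using a temporary file as a stand-in for the real results path:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class LocalAppend {
    public static void main(String[] args) throws IOException {
        // stand-in for /home/results.txt while testing locally
        Path pt = Files.createTempFile("results", ".txt");
        // APPEND writes at the end of the file; CREATE makes it if missing
        Files.write(pt, "something\n".getBytes(StandardCharsets.UTF_8),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        Files.write(pt, "something else\n".getBytes(StandardCharsets.UTF_8),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        System.out.println(Files.readAllLines(pt)); // prints [something, something else]
    }
}
```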

Upvotes: 0

Views: 3816

Answers (1)

Rishi Dwivedi

Reputation: 928

Early versions of HDFS had no support for an append operation. Once a file was closed, it was immutable and could only be changed by writing a new copy with a different filename.

see more information here
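On the branches where append shipped but was disabled by default (around 0.20.x/1.x), it typically had to be switched on in hdfs-site.xml before fs.append() would work. A hedged sketch of that configuration:

```xml
<property>
  <name>dfs.support.append</name>
  <value>true</value>
</property>
```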

If you are using an old version, this workaround worked for me:

// Append is not available, so read the whole file first,
// then rewrite it together with the new content.
List<String> lines = new ArrayList<String>();
BufferedReader bfr = new BufferedReader(new InputStreamReader(hdfs.open(path))); // open file first
String str;
while ((str = bfr.readLine()) != null) {
    lines.add(str); // keep the existing file content
}
bfr.close();

BufferedWriter br = new BufferedWriter(new OutputStreamWriter(hdfs.create(path, true))); // overwrite
for (String s : lines) {
    br.write(s); // write the old content back
    br.newLine();
}
br.write("Hello"); // the "appended" line
br.newLine();
br.close(); // close it
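The read-then-rewrite workaround above can be exercised against a local file with plain java.nio, which is handy for checking the logic before moving to the cluster (the file name here is illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class AppendByRewrite {
    // "Append" a line by reading the whole file, then rewriting it plus the new line.
    static void appendLine(Path file, String line) throws IOException {
        List<String> lines = new ArrayList<String>();
        if (Files.exists(file)) {
            lines = Files.readAllLines(file); // read existing content first
        }
        lines.add(line);          // add the new line at the end
        Files.write(file, lines); // rewrite the file, overwriting the old copy
    }

    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("results", ".txt");
        appendLine(file, "first");
        appendLine(file, "second");
        System.out.println(Files.readAllLines(file)); // prints [first, second]
    }
}
```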

Upvotes: 2
