Vivek

Reputation: 41

Processing huge pipe delimited files

With reference to my previous post

Remove first line from a delimited file

I was able to process smaller files and remove the first line, but for huge files there is a memory issue, because I am reading the whole file into memory and then writing it back again.

Can anybody suggest a better alternative for this?

Thanks in advance.

Vivek

Upvotes: 0

Views: 1111

Answers (2)

Peter Lawrey

Reputation: 533660

To avoid rewriting the entire file to remove one line, you can maintain an index to the "start" of the file. This index is where you believe the start to be and where you would begin reading the file from. Periodically, e.g. once a night, you can rewrite the file so that this "start" is where the file actually starts.

This "start" location can be stored in another time or at the start of the existing file.

This means you can progressively "remove" all the lines of a file without re-writing it at all.
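
A minimal sketch of that idea, assuming the offset is kept in a small side file; the file names, class name and helper methods below are illustrative, not part of the original answer:

import java.io.*;

public class LogicalStart {
    // Read the stored "start" offset (0 if the index file does not exist yet).
    static long readStart(File index) throws IOException {
        if (!index.exists()) return 0L;
        try (DataInputStream in = new DataInputStream(new FileInputStream(index))) {
            return in.readLong();
        }
    }

    // Persist a new "start" offset.
    static void writeStart(File index, long offset) throws IOException {
        try (DataOutputStream out = new DataOutputStream(new FileOutputStream(index))) {
            out.writeLong(offset);
        }
    }

    // "Remove" the first remaining line by moving the stored offset past it.
    static void removeFirstLine(File data, File index) throws IOException {
        long start = readStart(index);
        try (RandomAccessFile raf = new RandomAccessFile(data, "r")) {
            raf.seek(start);
            if (raf.readLine() != null) {                  // consume one line
                writeStart(index, raf.getFilePointer());   // next read starts after it
            }
        }
    }
}

Readers would seek to the stored offset before processing, and the periodic rewrite would copy the file from that offset and reset the index to 0.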

Upvotes: 0

AlexR

Reputation: 115378

You have to read the file line by line, skip the first line, and write the rest to a new file (which you can then rename over the original):

import java.io.*;

BufferedReader reader = new BufferedReader(new FileReader("foo.txt"));
PrintWriter writer = new PrintWriter(new FileWriter("_foo.txt"));

String line;
boolean firstLine = true;

while ((line = reader.readLine()) != null) {
    if (firstLine) {
        firstLine = false;      // skip the first line only
    } else {
        writer.println(line);   // copy every other line
    }
}

writer.close();
reader.close();

Upvotes: 1
