King King

Reputation: 63317

Read large text file in java, infeasible?

I'm using the following method to read a file into a JTextArea:

public void readFile(File file) throws java.io.FileNotFoundException,
                                       java.io.IOException {
    if (file == null) return;
    jTextArea1.setText("");
    try (BufferedReader reader = new BufferedReader(new FileReader(file))) {
        String line;
        while ((line = reader.readLine()) != null) {
            jTextArea1.append(line + "\n");
        }
    }
}

It works fine with normal-sized files (a few hundred kilobytes), but when I tested it on a 30,000-line, 42 MB file that Notepad opens in about 5 seconds, my reader took forever. After 15-20 minutes it was still running and consuming about 30% CPU, so I gave up waiting.

Could you please suggest a solution? I'm handling text files only, not binary files, and all I know is that BufferedReader is supposed to be the best way to read them.

Upvotes: 1

Views: 709

Answers (1)

wchargin

Reputation: 16027

The problem is likely not in the file reading but in the processing: each call to JTextArea.append updates the underlying document model and can trigger event handling and re-layout, so thousands of repeated calls become very expensive for large inputs.

Consider using a StringBuilder. This class is designed for quickly building long strings from parts on a single thread (see StringBuffer for a thread-safe counterpart).

if (file == null) return;
StringBuilder sb = new StringBuilder();
jTextArea1.setText("");
try (BufferedReader reader = new BufferedReader(new FileReader(file))) {
    String line;
    while ((line = reader.readLine()) != null) {
        sb.append(line);
        sb.append('\n');
    }
    // One setText call instead of thousands of appends.
    jTextArea1.setText(sb.toString());
}

As suggested in the comments, you may wish to perform this action in a new thread so the user doesn't think your program has frozen.
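A minimal sketch of that approach using SwingWorker, assuming jTextArea1 and file are in scope as in the code above (imports: javax.swing.SwingWorker, java.io.BufferedReader, java.io.FileReader):

// Read the file on a background thread; update the UI in done(),
// which runs on the Event Dispatch Thread.
new SwingWorker<String, Void>() {
    @Override
    protected String doInBackground() throws Exception {
        StringBuilder sb = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(new FileReader(file))) {
            String line;
            while ((line = reader.readLine()) != null) {
                sb.append(line).append('\n');
            }
        }
        return sb.toString();
    }

    @Override
    protected void done() {
        try {
            jTextArea1.setText(get()); // safe: runs on the EDT
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}.execute();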

Upvotes: 3
