Reputation: 13
I have written a Java program that reads a txt file line by line, finds a certain value in each line, edits it, and writes all lines to a new file. For example:
Input:
4563,9876,abc545
Output:
4563,9876_1,abc545
I am running the program from my command prompt, and it can process 1 million records. But if I process a few more, I get the error below:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
I tried to sort it out, but without success. Below is my Java class; can I get some suggestions on how to improve my code so it can handle more records?
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.Arrays;

class RecordTreatment {
    public static void main(String args[]) throws IOException {
        // Open the file
        File file = new File("C:\\Users\\tolen\\Desktop\\test.txt");
        FileInputStream fstream = new FileInputStream(file);
        BufferedReader br = new BufferedReader(new InputStreamReader(fstream));
        StringBuilder fileContent = new StringBuilder();
        String strLine;
        int counter = 1;
        // Read File Line By Line
        while ((strLine = br.readLine()) != null) {
            String tokens[] = strLine.split(",");
            if (tokens.length > 0) {
                String tokens1[] = tokens[16].split("\"");
                tokens[16] = "\"" + tokens1[1] + "_" + counter++ + "\"";
                for (int i = 0; i < tokens.length; i++) {
                    if (tokens[i].equals(tokens[tokens.length - 1])) {
                        fileContent.append(tokens[i]);
                    } else {
                        fileContent.append(tokens[i] + ",");
                    }
                }
                fileContent.append("\n");
            }
        }
        FileWriter fstreamWrite = new FileWriter("C:\\Users\\tolen\\Desktop\\test1.txt");
        BufferedWriter out = new BufferedWriter(fstreamWrite);
        out.write(fileContent.toString());
        out.close();
        // Close the input stream
        br.close();
    }
}
Upvotes: 1
Views: 211
Reputation: 410
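Write each rewritten line straight to the output file inside the loop instead of collecting everything in a StringBuilder, so only one line is held in memory at a time: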
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.io.InputStreamReader;

class RecordTreatment {
    public static void main(String args[]) throws IOException {
        // Open the input file
        File file = new File("C:\\Users\\tolen\\Desktop\\test.txt");
        FileInputStream fstream = new FileInputStream(file);
        BufferedReader br = new BufferedReader(new InputStreamReader(fstream));
        // Open the output file up front so each line can be written as soon as it is processed
        FileWriter fstreamWrite = new FileWriter("C:\\Users\\tolen\\Desktop\\test1.txt");
        BufferedWriter out = new BufferedWriter(fstreamWrite);
        String strLine;
        int counter = 1;
        // Read file line by line and write each rewritten line immediately
        while ((strLine = br.readLine()) != null) {
            String tokens[] = strLine.split(",");
            if (tokens.length > 0) {
                String sub_tokens[] = tokens[16].split("\"");
                tokens[16] = String.format("\"%s_%d\"", sub_tokens[1], counter++);
                out.write(String.join(",", tokens));
                out.write("\n");
            }
        }
        out.close();
        br.close();
    }
}
Upvotes: 1
Reputation: 5055
Create the StringBuilder inside the while loop, write it to the file on every iteration, and close the BufferedWriter out after the while loop.
This is happening because the StringBuilder object you have stores all of the file's data while looping. Here is an answer to a similar question; have a look at it too.
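A minimal sketch of that idea, reusing the file paths and the field layout from the question (the exact structure here is just for illustration, not taken from the original post):

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;

class RecordTreatment {
    public static void main(String[] args) throws IOException {
        BufferedReader br = new BufferedReader(new FileReader("C:\\Users\\tolen\\Desktop\\test.txt"));
        BufferedWriter out = new BufferedWriter(new FileWriter("C:\\Users\\tolen\\Desktop\\test1.txt"));
        String strLine;
        int counter = 1;
        while ((strLine = br.readLine()) != null) {
            String[] tokens = strLine.split(",");
            if (tokens.length > 0) {
                String[] subTokens = tokens[16].split("\"");
                tokens[16] = "\"" + subTokens[1] + "_" + counter++ + "\"";
                // The StringBuilder is created inside the loop, so it only ever
                // holds the current line, never the whole output file
                StringBuilder line = new StringBuilder();
                for (int i = 0; i < tokens.length; i++) {
                    line.append(tokens[i]);
                    if (i < tokens.length - 1) {
                        line.append(",");
                    }
                }
                line.append("\n");
                out.write(line.toString());
            }
        }
        // Close the writer only after the loop has finished
        out.close();
        br.close();
    }
}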
Upvotes: 0
Reputation: 1037
You're running out of memory because you're building the entire output in that StringBuilder at once. Instead, could you try appending to your output file as you read each line of the input file in that while loop?
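A rough sketch of that approach, assuming the same paths and field index as in the question; java.nio.file.Files is used here just as one convenient way to open the reader and writer:

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

class RecordTreatment {
    public static void main(String[] args) throws IOException {
        int counter = 1;
        try (BufferedReader br = Files.newBufferedReader(Paths.get("C:\\Users\\tolen\\Desktop\\test.txt"));
             BufferedWriter out = Files.newBufferedWriter(Paths.get("C:\\Users\\tolen\\Desktop\\test1.txt"))) {
            String strLine;
            while ((strLine = br.readLine()) != null) {
                String[] tokens = strLine.split(",");
                if (tokens.length > 0) {
                    String[] subTokens = tokens[16].split("\"");
                    tokens[16] = "\"" + subTokens[1] + "_" + counter++ + "\"";
                    // Append the rewritten line to the output file immediately,
                    // so only one line is held in memory at a time
                    out.write(String.join(",", tokens));
                    out.newLine();
                }
            }
        } // try-with-resources closes both streams here
    }
}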
Upvotes: 0