Reputation: 325
The following code reads a number of .csv files and combines them into one .csv file. When I System.out.println the data, all data points are correct; however, when I try to use PrintWriter I get:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
I tried FileWriter but got the same error. How should I correct my code?
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.io.PrintWriter;

public class CombineCsv {
    public static void main(String[] args) throws IOException {
        PrintWriter output = new PrintWriter("C:\\User\\result.csv");
        final File file = new File("C:\\Users\\is");
        int i = 0;
        for (final File child : file.listFiles()) {
            BufferedReader CSVFile = new BufferedReader(new FileReader("C:\\Users\\is\\" + child.getName()));
            String dataRow = CSVFile.readLine();
            while (dataRow != null) {
                String[] dataArray = dataRow.split(",");
                for (String item : dataArray) {
                    System.out.println(item + "\t");
                    output.append(item + "," + child.getName().replaceAll(".csv", "") + ",");
                    i++;
                }
                dataRow = CSVFile.readLine(); // Read the next line of data.
            }
            CSVFile.close(); // Close the file once all data has been read.
        }
        output.close();
        System.out.println(i);
    }
}
Upvotes: 0
Views: 2168
Reputation: 1482
Try
boolean autoFlush = true;
PrintWriter output = new PrintWriter(new FileWriter(myFileName), autoFlush);
Since PrintWriter has no (fileName, autoFlush) constructor, the file name is wrapped in a FileWriter. This creates a PrintWriter instance that flushes its buffer every time println, printf, or format is called.
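For reference, a minimal self-contained sketch of this approach (the class name and file path are placeholders, not from the question); note that auto-flushing triggers on println, printf, and format calls, not on append:
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class AutoFlushExample {
    public static void main(String[] args) throws IOException {
        boolean autoFlush = true;
        // PrintWriter(Writer, boolean) is the constructor that takes an autoFlush flag,
        // so the file name is wrapped in a FileWriter first.
        PrintWriter output = new PrintWriter(new FileWriter("result.csv"), autoFlush);
        output.println("a,b,c"); // println flushes the buffer when autoFlush is true
        output.close();
    }
}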
Upvotes: 0
Reputation: 718798
I can only think of two scenarios in which that code could result in an OOME:
1. If the file directory has a very large number of elements, then file.listFiles() could create a very large array of File objects (a lazy alternative is sketched after this list).
2. If one of the input files includes a line that is very long, then CSVFile.readLine() could use a lot of memory in the process of reading it. (Up to 6 times the number of bytes in the line.)
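As a side note on the first scenario: if the directory really does contain a huge number of files, one way to avoid building the whole File[] array is to iterate the entries lazily. A minimal sketch, assuming Java 7+ and using java.nio.file.Files.newDirectoryStream as an alternative to listFiles() (the class name is a placeholder):
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class LazyListing {
    public static void main(String[] args) throws IOException {
        Path dir = Paths.get("C:\\Users\\is");
        // newDirectoryStream yields one entry at a time instead of materializing an array
        try (DirectoryStream<Path> entries = Files.newDirectoryStream(dir, "*.csv")) {
            for (Path entry : entries) {
                System.out.println(entry.getFileName());
            }
        }
    }
}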
The simplest approach to solving both of these issues is to increase the Java heap size using the -Xmx JVM option.
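For example, to run the program with a one-gigabyte heap (the size here is only illustrative):
java -Xmx1024m CombineCsv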
I can see no reason why your use of a PrintWriter would be the cause of the problem.
Upvotes: 1