Reputation: 12838
Scenario:
Should I:
I will have to keep things simple. Any suggestions?
Upvotes: 5
Views: 2023
Reputation: 2558
Your best bet will be to read each line as a String, which is easy in Java. Once the line is in a String, it is trivial to split it on each comma with
String[] row = parsedRow.split(",");
Then you will have a String for each value in the array, which can then be operated on.
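A minimal, runnable sketch of that idea (the sample row and the field meanings are made up for illustration; note that a plain `split(",")` does not handle quoted fields that themselves contain commas):

```java
// Splitting one CSV line and operating on the resulting values.
public class SplitDemo {
    public static void main(String[] args) {
        String parsedRow = "Alice,30,New York"; // illustrative sample row
        String[] row = parsedRow.split(",");
        String name = row[0];
        int age = Integer.parseInt(row[1]); // convert a field to a number
        System.out.println(name + " is " + age);
    }
}
```

For real-world CSV data with quoting and escaping, a dedicated parser library is safer than `split`.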
Upvotes: 1
Reputation: 120771
Java casts like
Object a = new String();
String b = (String) a;
are not expensive, no matter whether you cast Strings or any other type.
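A quick way to see this: a reference cast produces no new object, so both references still point to the same instance.

```java
// A cast only reinterprets the reference type; no object is copied.
public class CastDemo {
    public static void main(String[] args) {
        Object a = "hello";
        String b = (String) a;
        System.out.println(a == b); // same instance, so identity comparison is true
    }
}
```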
Upvotes: 2
Reputation: 6149
I'd recommend using a MappedByteBuffer (from NIO), that you can use to read a file too big to fit into memory. It maps only a region of the file into memory; once you're done reading this region (say, the first 10k), map the next one, and so on, until you've read the whole file. Memory-efficient and quite easy to implement.
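A hedged sketch of that windowed approach, here counting newlines region by region (the 10k window size is illustrative, and a record that straddles a window boundary would need extra handling in real parsing code):

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class MappedWindowDemo {
    // Maps the file one 10k region at a time and counts '\n' bytes.
    static long countLines(Path file) throws IOException {
        final int WINDOW = 10 * 1024;
        long lines = 0;
        try (FileChannel ch = FileChannel.open(file, StandardOpenOption.READ)) {
            long size = ch.size();
            for (long pos = 0; pos < size; pos += WINDOW) {
                long len = Math.min(WINDOW, size - pos);
                MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_ONLY, pos, len);
                while (buf.hasRemaining()) {
                    if (buf.get() == '\n') {
                        lines++;
                    }
                }
            }
        }
        return lines;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(countLines(Paths.get(args[0])));
    }
}
```

Only one window is mapped at a time, so memory use stays bounded regardless of the file size.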
Upvotes: 3
Reputation: 1108692
Casting doesn't change the amount of memory an object occupies. It just changes the runtime type.
If you can do those operations on a per-row basis, then just do the operation immediately inside the loop wherein you read a single line.
while ((line = reader.readLine()) != null) {
line = process(line);
writer.println(line);
}
This way you effectively hold only a single line in Java's memory at a time instead of the whole file.
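The loop above can be fleshed out into a self-contained sketch; the file names and the `process` transformation (here just uppercasing) are placeholders for whatever per-row operation you need:

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class StreamingProcessor {
    // Placeholder per-row operation; substitute your own transformation.
    static String process(String line) {
        return line.toUpperCase();
    }

    public static void main(String[] args) throws IOException {
        // try-with-resources closes both streams even if processing fails.
        try (BufferedReader reader = new BufferedReader(new FileReader("in.csv"));
             PrintWriter writer = new PrintWriter(new BufferedWriter(new FileWriter("out.csv")))) {
            String line;
            while ((line = reader.readLine()) != null) {
                writer.println(process(line));
            }
        }
    }
}
```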
Or, if you need to perform those operations based on the entire CSV file (i.e., the operations depend on all rows), then your most efficient bet is to import the CSV file into a real SQL database, use SQL statements to alter the data, and then export it back to a CSV file.
Upvotes: 8