Reputation: 660
I am working on a Java web application where users can upload their CSV files, and we then let them build charts, tables, etc. in our dashboard.
But when a user uploads a huge CSV file (above 500 MB) and uses it with multiple elements such as charts, tables and KPIs, it uses too much memory and slows the machine down.
Each element has a different aggregation and grouping, so multiple requests get executed against the same file, and the system goes out of memory and slows down.
Can anyone give me an idea of how to make this faster without running out of memory? I have 16 GB of RAM. How should I manage multiple requests like this?
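To make that concrete, every element request currently boils down to roughly this pattern (a deliberately simplified illustration with placeholder names, not our real code): each chart, table or KPI runs its own group-by and aggregation over the same rows of the uploaded file.
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class PerElementAggregation {
    // Placeholder illustration: every dashboard element re-reads the same parsed rows
    // and runs its own group-by + aggregation, so N elements means N full passes
    // over the 500 MB worth of rows held in memory at the same time.
    static Map<String, Double> aggregateForElement(List<Map<String, String>> allRows,
                                                   String groupByColumn, String valueColumn) {
        Map<String, Double> result = new HashMap<>();
        for (Map<String, String> row : allRows) {
            result.merge(row.get(groupByColumn),
                         Double.parseDouble(row.get(valueColumn)),
                         Double::sum);
        }
        return result;
    }
}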
For reading the CSV file I am using "univocity-parsers", which is quite fast, and for the aggregation we use Hazelcast Jet. The parsing part looks like this:
import com.univocity.parsers.common.IterableResult;
import com.univocity.parsers.common.ParsingContext;
import com.univocity.parsers.common.ResultIterator;
import com.univocity.parsers.csv.CsvParser;
import com.univocity.parsers.csv.CsvParserSettings;

import java.io.FileReader;
import java.util.HashMap;
import java.util.Map;

CsvParserSettings settings = new CsvParserSettings();
CsvParser parser = new CsvParser(settings);

// iterate() streams the file row by row instead of loading everything into memory at once
IterableResult<String[], ParsingContext> rows = parser.iterate(new FileReader(csvPath));
ResultIterator<String[], ParsingContext> rowIterator = rows.iterator();

String[] headers = null;
while (rowIterator.hasNext()) {
    String[] values = rowIterator.next();
    if (headers == null) {
        headers = values; // the first row of the file is the header row
        continue;
    }
    // turn the current row into a columnName -> value map
    Map<String, String> map = new HashMap<>();
    for (int i = 0; i < headers.length && i < values.length; i++) {
        map.put(headers[i], values[i]);
    }
    // ... each row map is then handed to the aggregation logic
}
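For context, the Hazelcast Jet side is wired roughly along these lines (a heavily simplified sketch in Jet 4.x style; the directory, header check, column indexes and map name are placeholders, not our real pipeline):
import com.hazelcast.jet.Jet;
import com.hazelcast.jet.JetInstance;
import com.hazelcast.jet.aggregate.AggregateOperations;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;
import com.hazelcast.jet.pipeline.Sources;

public class CsvAggregationSketch {
    public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        p.readFrom(Sources.files("/data/uploads"))            // placeholder directory; emits one text line per item
         .filter(line -> !line.startsWith("region"))          // skip the header line (placeholder header name)
         .map(line -> line.split(","))
         .groupingKey(cols -> cols[0])                        // group by the first column (placeholder)
         .aggregate(AggregateOperations.summingDouble(
                 cols -> Double.parseDouble(cols[2])))        // sum the third column (placeholder metric)
         .writeTo(Sinks.map("aggregatedResult"));             // placeholder IMap name

        JetInstance jet = Jet.bootstrappedInstance();
        try {
            jet.newJob(p).join();
        } finally {
            jet.shutdown();
        }
    }
}
Each dashboard element submits a job like this with its own grouping key and aggregate operation, which is where the repeated passes over the same file come from.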
I am only providing sample code here, as the whole code is quite long. I just need an idea or a mechanism that can improve performance.
Upvotes: 1
Views: 166