Reputation: 4570
I'm storing the results of a SQL query in an ArrayList<Hashtable<String, String>>
: each row goes into a Hashtable, and each Hashtable is then added to the list.
It works fine if there are fewer than 100,000 results, but with more results I get an OutOfMemoryError.
Do you have any suggestions on how I can improve this situation?
Upvotes: 1
Views: 659
Reputation: 221106
The simplest and probably least memory intensive way to improve this is to keep a two-dimensional array for your data:
String[][] data;
// Alternatively, if you prefer lists:
List<String[]> data;
You would then need just one additional data structure, such as
Map<String, Integer> fieldNameToIndexMapping;
... where you'd store a lookup table to access your fields by column index. E.g.
String value = data[row][fieldNameToIndexMapping.get("MY_COLUMN")];
More tuning is possible by using more concise data types where appropriate. I suspect that some of the String values you're storing in your data structure are actually Boolean, Integer, and other values that would be cheaper to store in a non-string form.
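A minimal sketch of this layout, assuming a hypothetical two-column result (column names `ID` and `MY_COLUMN` are illustrative; in practice you would build the index map once from `ResultSetMetaData`):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ColumnIndexedRows {
    // One shared column-name -> index map for all rows,
    // instead of per-row key storage in a Hashtable.
    static final Map<String, Integer> FIELD_INDEX = new HashMap<>();
    static {
        FIELD_INDEX.put("ID", 0);
        FIELD_INDEX.put("MY_COLUMN", 1);
    }

    // Look up a field by name via the shared index map.
    static String get(List<String[]> data, int row, String column) {
        return data.get(row)[FIELD_INDEX.get(column)];
    }

    public static void main(String[] args) {
        // One compact String[] per row instead of a Hashtable per row.
        List<String[]> data = new ArrayList<>();
        data.add(new String[] {"1", "alpha"});
        data.add(new String[] {"2", "beta"});

        System.out.println(get(data, 1, "MY_COLUMN")); // prints "beta"
    }
}
```

The saving comes from storing the column names once rather than in every row's hash table.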
Upvotes: 0
Reputation: 9579
There are a few different approaches. Stephen C is correct: it would be best to use a WHERE clause in your SQL to reduce the size of your result set.
If you are using caching, you can set a maximum size for the cache, and LinkedHashMap is a good fit: it is a Map implementation that keeps insertion order. You'll need to subclass it and override the removeEldestEntry() method, as indicated in the Javadoc.
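A small sketch of that pattern, with a hypothetical limit of 100 entries:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A simple bounded cache: once size exceeds MAX_ENTRIES, the eldest
// (first-inserted) entry is evicted automatically on each put().
public class BoundedCache<K, V> extends LinkedHashMap<K, V> {
    private static final int MAX_ENTRIES = 100; // hypothetical limit

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > MAX_ENTRIES;
    }

    public static void main(String[] args) {
        BoundedCache<Integer, String> cache = new BoundedCache<>();
        for (int i = 0; i < 150; i++) {
            cache.put(i, "row " + i);
        }
        System.out.println(cache.size());           // prints 100
        System.out.println(cache.containsKey(0));   // prints false (evicted)
        System.out.println(cache.containsKey(149)); // prints true
    }
}
```

Passing `true` as the third argument to the `LinkedHashMap(int, float, boolean)` constructor would switch it to access order instead, giving LRU rather than FIFO eviction.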
Upvotes: 0
Reputation: 719229
You can probably save some space by using a custom class to represent each row. (A HashMap
is a rather "memory hungry" data structure.) However, that is only going to put off the inevitable OOME.
A better idea would be to change your application so that you don't have to store the entire result set in memory at all.
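A sketch of what that might look like with plain JDBC: process each row as the driver delivers it rather than accumulating the whole result set. The table and column names are hypothetical, and here the per-row work is just a row count for illustration.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class StreamingQuery {
    static long rowCount = 0;

    // Iterate the ResultSet and handle each row immediately, so memory
    // use stays flat regardless of how many rows the query returns.
    static void processAll(Connection conn) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(
                "SELECT id, name FROM my_table")) {
            ps.setFetchSize(1000); // hint the driver to fetch in batches
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    handleRow(rs.getString("id"), rs.getString("name"));
                }
            }
        }
    }

    // Per-row work goes here; this placeholder just counts rows.
    static void handleRow(String id, String name) {
        rowCount++;
    }
}
```

Whether `setFetchSize` actually streams depends on the driver (e.g. MySQL and PostgreSQL drivers need specific settings), but the shape of the code is the same.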
Upvotes: 1