Reputation: 14066
I always get an out-of-memory error.
My code is:
public static String openAssetFile(Context ctx) {
    BufferedReader br = new BufferedReader(
            new InputStreamReader(ctx.getResources().openRawResource(R.raw.allx)));
    StringBuilder text = new StringBuilder();
    try {
        String line;
        while ((line = br.readLine()) != null) {
            text.append(line);
            text.append('\n');
        }
    } catch (IOException e) {
        Log.e("IOErr", "IOErr");
    } finally {
        try {
            br.close(); // close the reader so the stream isn't leaked
        } catch (IOException ignored) {
        }
    }
    return text.toString();
}
It works fine for small files, but what can I do when I'd like to open a bigger file, like 3-6 MB?
thanks
Upvotes: 0
Views: 3489
Reputation: 16363
The solution depends on what you're reading such a long text for. In my case I faced the same problem reading a large text to show in a multiline TextView. I resolved it by writing my own class implementing the CharSequence interface, which simply chunks the input file and reads it piece by piece.
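A minimal sketch of that chunking idea (not the actual class from my app): a CharSequence backed by a file that only keeps one small chunk in memory at a time. `FileCharSequence` and `CHUNK_SIZE` are names I'm making up here, and it assumes a single-byte encoding so that char index == byte offset.

```java
import java.io.IOException;
import java.io.RandomAccessFile;

// Hypothetical sketch: a lazy, chunk-at-a-time CharSequence over a file.
// Assumes single-byte characters (e.g. ASCII) for simplicity.
class FileCharSequence implements CharSequence {
    private static final int CHUNK_SIZE = 4096;

    private final RandomAccessFile file;
    private final int length;
    private char[] chunk = new char[0];
    private int chunkStart = -1; // file offset of the currently loaded chunk

    FileCharSequence(String path) throws IOException {
        file = new RandomAccessFile(path, "r");
        length = (int) file.length();
    }

    public int length() {
        return length;
    }

    public char charAt(int index) {
        // Load the chunk containing index only when it isn't cached yet.
        if (chunkStart < 0 || index < chunkStart || index >= chunkStart + chunk.length) {
            try {
                chunkStart = (index / CHUNK_SIZE) * CHUNK_SIZE;
                byte[] buf = new byte[Math.min(CHUNK_SIZE, length - chunkStart)];
                file.seek(chunkStart);
                file.readFully(buf);
                chunk = new char[buf.length];
                for (int i = 0; i < buf.length; i++) {
                    chunk[i] = (char) (buf[i] & 0xFF);
                }
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        }
        return chunk[index - chunkStart];
    }

    public CharSequence subSequence(int start, int end) {
        StringBuilder sb = new StringBuilder(end - start);
        for (int i = start; i < end; i++) {
            sb.append(charAt(i));
        }
        return sb;
    }

    public String toString() {
        return subSequence(0, length).toString();
    }
}
```

A TextView can take any CharSequence, so setText() on something like this never forces the whole file into one String.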
Upvotes: 1
Reputation: 886
How much memory is the rest of your app using? Your app can use a maximum of 16 MB (or more depending on the device), so loading a 3 MB file shouldn't cause an out-of-memory error on its own.
That said, I recall there is a limit on the size of files you can open from within your app package (~2 MB). What you can do is split your file into smaller files that are under the limit and then use http://download.oracle.com/javase/1.4.2/docs/api/java/io/SequenceInputStream.html to read those multiple files as if they were one file.
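A sketch of the split-file approach using SequenceInputStream, which concatenates streams transparently (so a line split across two part-files is still read as one line). On Android each InputStream would come from openRawResource(R.raw.part1) etc.; the class and method names here are just illustrative.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.SequenceInputStream;
import java.util.Arrays;
import java.util.Collections;

// Illustrative sketch: stitch several part-streams back into one logical file.
class SplitReader {
    public static String readAll(InputStream... parts) throws IOException {
        // SequenceInputStream reads each stream in order, moving to the
        // next one as the previous is exhausted.
        InputStream combined = new SequenceInputStream(
                Collections.enumeration(Arrays.asList(parts)));
        BufferedReader br = new BufferedReader(new InputStreamReader(combined));
        StringBuilder text = new StringBuilder();
        try {
            String line;
            while ((line = br.readLine()) != null) {
                text.append(line).append('\n');
            }
        } finally {
            br.close();
        }
        return text.toString();
    }
}
```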
Alternatively, find a way to preprocess and compress your data so you aren't using so much memory.
Your code looks very inefficient. From experience writing similar code, I'm going to guess it takes about 20s to run. It'll create tons of garbage and all those operations are going to take time. Is there no way you can just read all your data directly in one go with no processing for each entry? e.g. you could preprocess your data, serialise it to an object and then just unserialise this in your app.
I've worked on an app that needed to load a 3Mb dictionary file and the solution for me was to preprocess it into an optimised data structure I needed (bringing the space requirement down to 0.5Mb), serialise that, then just load the object directly on startup.
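The preprocess-then-serialise idea can be sketched like this: build your optimised structure once (e.g. at build time), write it out with ObjectOutputStream, and deserialise it at startup instead of re-parsing the raw text every launch. `SerializedDictionary` and the HashSet are stand-ins for whatever structure your app actually needs.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.OutputStream;
import java.util.HashSet;

// Illustrative sketch: round-trip a preprocessed data structure with
// Java object serialisation instead of re-parsing the raw text file.
class SerializedDictionary {
    // Run once, offline: serialise the preprocessed structure.
    public static void save(HashSet<String> words, OutputStream out) throws IOException {
        ObjectOutputStream oos = new ObjectOutputStream(out);
        oos.writeObject(words);
        oos.close();
    }

    // Run at app startup: load the structure directly, no text parsing.
    @SuppressWarnings("unchecked")
    public static HashSet<String> load(InputStream in)
            throws IOException, ClassNotFoundException {
        ObjectInputStream ois = new ObjectInputStream(in);
        HashSet<String> words = (HashSet<String>) ois.readObject();
        ois.close();
        return words;
    }
}
```

On Android you'd ship the serialised blob as a raw resource and pass openRawResource(...) to load().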
Upvotes: 1