Reputation: 8913
I have a 50GB JSON file that I want to be read by my Java application, with each record being converted into a POJO and put on a queue, one by one, to be processed by a separate application.
Ordinarily, I'd use a BufferedReader, but I'm wondering what the best practices are for files of that size. I'm currently looking at java.nio. Would that be suitable?
I'd also like the processing of the file to continue even if one record is corrupt. I don't want one bad egg spoiling the basket, so to speak.
Upvotes: 0
Views: 261
Reputation: 101
What you are looking for is streaming deserialization. Gson supports this via its streaming API; see https://sites.google.com/site/gson/streaming for some good examples.
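A minimal sketch of the idea, assuming the file is a single top-level JSON array and a hypothetical `Record` POJO: Gson's `JsonReader` walks the array one element at a time, so memory stays constant regardless of file size, and each element can be deserialized and handed off individually.

```java
import com.google.gson.Gson;
import com.google.gson.JsonSyntaxException;
import com.google.gson.stream.JsonReader;

import java.io.Reader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class StreamingExample {

    // Hypothetical POJO standing in for whatever one record looks like.
    static class Record {
        String name;
        int value;
    }

    // Reads a JSON array element by element; the whole file is never
    // held in memory at once.
    static List<Record> readAll(Reader source) throws Exception {
        Gson gson = new Gson();
        List<Record> records = new ArrayList<>();
        try (JsonReader reader = new JsonReader(source)) {
            reader.beginArray();
            while (reader.hasNext()) {
                try {
                    Record r = gson.fromJson(reader, Record.class);
                    records.add(r); // in the real app: queue.put(r)
                } catch (JsonSyntaxException e) {
                    // Log and move on. Caveat: after a low-level syntax error
                    // the reader's position may not be recoverable, so corrupt
                    // records are much easier to skip cleanly if the file is
                    // newline-delimited JSON (one record per line).
                    reader.skipValue();
                }
            }
            reader.endArray();
        }
        return records;
    }

    public static void main(String[] args) throws Exception {
        String json = "[{\"name\":\"a\",\"value\":1},{\"name\":\"b\",\"value\":2}]";
        List<Record> out = readAll(new StringReader(json));
        System.out.println(out.size());
    }
}
```

For a real 50GB file you would pass a `Reader` over the file (e.g. `Files.newBufferedReader(path)`) instead of a `StringReader`, and push each deserialized record onto your queue inside the loop.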
Upvotes: 2