Reputation: 9206
I'm debugging and fixing a complex app that works with a huge Java object (~250 MB).
I've created this object with another program. Currently I use XStream to load and save this object from the hard drive, but parsing it takes more than a minute, which slows down the development process.
Is JAXB faster? Are there any other ways to load and save this huge thing?
Upvotes: 2
Views: 3292
Reputation: 533530
In that case I would serialize the data, which will make it smaller and faster. You can make key classes Externalizable to improve the speed further. Here are some tests I ran more recently: Protobuf vs Thrift vs Java Serialization. It's the same benchmark kovica suggests, but run more recently on newer hardware/software.
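For illustration, a minimal sketch of making a key class Externalizable so you control exactly what gets written (the class name and fields here are hypothetical):

```java
import java.io.*;

// Hypothetical key class: implementing Externalizable lets you write
// the fields yourself, avoiding the reflection overhead of default
// serialization.
public class DataPoint implements Externalizable {
    private long timestamp;
    private double value;

    // Externalizable requires a public no-arg constructor.
    public DataPoint() {}

    public DataPoint(long timestamp, double value) {
        this.timestamp = timestamp;
        this.value = value;
    }

    @Override
    public void writeExternal(ObjectOutput out) throws IOException {
        out.writeLong(timestamp);
        out.writeDouble(value);
    }

    @Override
    public void readExternal(ObjectInput in)
            throws IOException, ClassNotFoundException {
        timestamp = in.readLong();
        value = in.readDouble();
    }
}
```

Instances of such a class then serialize as a fixed, compact sequence of primitives rather than a reflective field-by-field dump.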
If you need to go faster you can use memory-mapped data. It is much harder to work with, but you can't beat the performance: you can load 250 MB in tens of milliseconds if the data is already in the disk cache. If not, you will be limited by the speed of your drive, e.g. a slow 40 MB/s drive will take about 6 seconds.
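A minimal sketch of reading data through a memory-mapped buffer with plain java.nio (the file name and the fixed-width record layout are assumptions for illustration):

```java
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class MappedReader {
    public static void main(String[] args) throws Exception {
        // Map the whole file into memory; the OS pages it in on demand,
        // so a warm disk cache makes this near-instant.
        try (RandomAccessFile raf = new RandomAccessFile("data.bin", "r");
             FileChannel channel = raf.getChannel()) {
            MappedByteBuffer buffer =
                channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());

            // Hypothetical fixed-width records: a long timestamp
            // followed by a double value.
            while (buffer.remaining() >= Long.BYTES + Double.BYTES) {
                long timestamp = buffer.getLong();
                double value = buffer.getDouble();
                // ... process the record ...
            }
        }
    }
}
```

The trade-off is that you are working with raw bytes and fixed layouts instead of an object graph, which is why it is harder to work with.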
A library I wrote that uses memory-mapped files is Java Chronicle.
Upvotes: 2
Reputation: 5654
I would recommend Java serialization over the others. It can take advantage of reusing objects already in the heap (e.g. via String's intern() method) instead of creating new objects (as other serializers generally do). This reduces the total payload size.
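A small sketch of the point: Java serialization writes a shared object once and emits cheap back-references for repeats, so interned (shared) instances shrink the payload (the class and values here are made up for illustration):

```java
import java.io.*;
import java.util.*;

public class SharedReferenceDemo {
    public static void main(String[] args) throws Exception {
        String shared = "repeated-value";
        // The same String instance referenced many times...
        List<String> list = new ArrayList<>();
        for (int i = 0; i < 1000; i++) {
            list.add(shared);
        }

        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(list);
        }
        // ...is written in full only once; each repeat costs a small
        // back-reference handle rather than the full string.
        System.out.println("Serialized size: " + bytes.size() + " bytes");
    }
}
```

Note that this deduplication is by object identity, which is exactly why interning equal strings into one instance before serializing pays off.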
Upvotes: 1
Reputation: 2583
I've never used XStream, but according to its documentation it produces XML, right? Do you need to store those objects as XML? If you just need to store the object to disk, there are a couple of different approaches you can use, such as plain Java serialization or Google Protocol Buffers; a minimal Java serialization sketch follows below.
See also: High performance serialization: Java vs Google Protocol Buffers vs ...?
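For the plain Java serialization route, a minimal save/load sketch (the class and method names are placeholders; the object graph must implement Serializable):

```java
import java.io.*;

public class DiskStore {
    // Write the whole object graph to disk with standard Java serialization.
    static void save(Object root, File file) throws IOException {
        try (ObjectOutputStream out = new ObjectOutputStream(
                new BufferedOutputStream(new FileOutputStream(file)))) {
            out.writeObject(root);
        }
    }

    // Read it back; the caller casts to the expected root type.
    static Object load(File file) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(
                new BufferedInputStream(new FileInputStream(file)))) {
            return in.readObject();
        }
    }
}
```

The buffered streams matter here: for a ~250 MB object, unbuffered I/O alone can dominate the load time.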
Upvotes: 2