Reputation: 1077
I need to write huge files (more than 1 million lines) and send each file to a different machine, where I read it with a Java BufferedReader, one line at a time.
I was using indented JSON, but it turned out not to be very handy:
it requires too much coding and consumes extra RAM/CPU.
I'm looking for something that looks like this:
client:id="1" name="jack" address="House N°1\nCity N°3 \n Country 1" age="20"
client:id="2" name="alice" address="House N°2\nCity N°5 \n Country 2" age="30"
vehicle:id="1" model="ford" hp="250" fuel="diesel"
vehicle:id="2" model="nissan" hp="190" fuel="diesel"
This way I can read the objects one at a time.
I know about URL encoding & Base64, but I'm trying to keep the lines short and readable.
So any suggestions please!
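For what it's worth, a format like the one above is straightforward to consume one line at a time. Here is a minimal sketch, assuming each record is a type prefix followed by `key="value"` pairs, with newlines escaped as `\n` inside values (the `RecordReader` class name and the `_type` key are illustrative, not standard):

```java
import java.io.BufferedReader;
import java.io.StringReader;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RecordReader {
    // Matches key="value" pairs; assumes values contain no raw double quotes.
    private static final Pattern FIELD = Pattern.compile("(\\w+)=\"([^\"]*)\"");

    static Map<String, String> parseLine(String line) {
        Map<String, String> record = new LinkedHashMap<>();
        int colon = line.indexOf(':');
        record.put("_type", line.substring(0, colon));
        Matcher m = FIELD.matcher(line.substring(colon + 1));
        while (m.find()) {
            // Turn the escaped "\n" sequences back into real newlines.
            record.put(m.group(1), m.group(2).replace("\\n", "\n"));
        }
        return record;
    }

    public static void main(String[] args) throws Exception {
        String data = "client:id=\"1\" name=\"jack\" age=\"20\"\n"
                    + "vehicle:id=\"1\" model=\"ford\" hp=\"250\" fuel=\"diesel\"\n";
        try (BufferedReader in = new BufferedReader(new StringReader(data))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(parseLine(line));
            }
        }
    }
}
```

In a real run, the `StringReader` would be replaced by a `FileReader` over the received file; memory use stays flat because only one line is held at a time.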
Upvotes: 0
Views: 318
Reputation: 2937
With huge files, any textual data format, especially markup-heavy ones like JSON, YAML or XML, is not a very good solution.
I suggest using a universal binary format, like Google Protocol Buffers or ASN.1.
Google Protocol Buffers is much easier to get started with.
Of course, if you only need Java-to-Java data transfer, you can use Java's out-of-the-box serialization.
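For the Java-to-Java case, a minimal sketch of the built-in serialization looks like this (the `Client` class is a hypothetical record type; any class implementing `Serializable` works the same way):

```java
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class JavaSerializationDemo {
    // Hypothetical record type; must implement Serializable.
    static class Client implements Serializable {
        private static final long serialVersionUID = 1L;
        final int id;
        final String name;
        Client(int id, String name) { this.id = id; this.name = name; }
    }

    public static void main(String[] args) throws Exception {
        File file = File.createTempFile("clients", ".bin");
        // Write objects one at a time; the stream handles framing for us.
        try (ObjectOutputStream out = new ObjectOutputStream(
                new BufferedOutputStream(new FileOutputStream(file)))) {
            out.writeObject(new Client(1, "jack"));
            out.writeObject(new Client(2, "alice"));
        }
        // On the receiving side, read them back one at a time.
        try (ObjectInputStream in = new ObjectInputStream(
                new BufferedInputStream(new FileInputStream(file)))) {
            Client c1 = (Client) in.readObject();
            Client c2 = (Client) in.readObject();
            System.out.println(c1.name + ", " + c2.name);
        }
        file.delete();
    }
}
```

Note that both sides must have compatible class definitions (same `serialVersionUID`), which is the main constraint of this approach compared to a schema-based format like Protocol Buffers.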
Upvotes: 2
Reputation: 1168
What about reading/writing the files in binary format using DataInputStream
and DataOutputStream?
Of course, your data must have a fixed structure, but as a benefit you'll get smaller files and faster reading/writing.
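A minimal sketch of this approach, assuming a fixed record layout of id (int), name (UTF string), age (int):

```java
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.EOFException;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;

public class DataStreamDemo {
    public static void main(String[] args) throws Exception {
        File file = File.createTempFile("clients", ".dat");
        // Write each record as id (int), name (UTF), age (int).
        try (DataOutputStream out = new DataOutputStream(
                new BufferedOutputStream(new FileOutputStream(file)))) {
            out.writeInt(1); out.writeUTF("jack");  out.writeInt(20);
            out.writeInt(2); out.writeUTF("alice"); out.writeInt(30);
        }
        // Read fields back in exactly the order they were written.
        try (DataInputStream in = new DataInputStream(
                new BufferedInputStream(new FileInputStream(file)))) {
            while (true) {
                int id;
                try { id = in.readInt(); } catch (EOFException eof) { break; }
                String name = in.readUTF();
                int age = in.readInt();
                System.out.println(id + " " + name + " " + age);
            }
        }
        file.delete();
    }
}
```

The catch is that reader and writer must agree exactly on the field order and types; there is no self-describing framing as in JSON or Protocol Buffers.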
Upvotes: 1