Reputation: 2542
I am trying to write a huge file, between 500 MB and 1.5 GB, to disk. I used ZipOutputStream to zip it and then send it over the network, but on the client, when I try to unzip and write it, I get a Java heap space exception.
try (FileOutputStream fos = new FileOutputStream(outFile)) {
    int fileLength = (int) zipEntry.getSize();
    byte[] fileByte = new byte[fileLength];
    zips.read(fileByte);
    fos.write(fileByte);
}
I know that's quite a big memory allocation for a byte array, but how can I fix it?
Upvotes: 2
Views: 5835
Reputation: 44414
Instead of reading the entire file into memory all at once, you can accomplish the same thing with Files.copy(zips, outFile.toPath());. See the java.nio.file.Files Javadoc for documentation on the Files.copy method.
Upvotes: 2
Reputation: 11440
The byte[] array you are making is your buffer; it serves as a temporary location in the heap for data in transit from your InputStream to your OutputStream. Unless you want your program to use 500 MB to 1.5 GB of memory, you need to reduce your buffer size. Here is a common method I use for this operation. It uses a 1 KB buffer; you can play with the size and see what suits you best.
/**
 * Writes all bytes from the source input stream to the destination output stream.
 * @param source the stream to read from
 * @param destination the stream to write to
 * @throws IOException if reading or writing fails
 */
public static void pipeStreams(java.io.InputStream source, java.io.OutputStream destination) throws IOException {
    // 1 KB buffer
    byte[] buffer = new byte[1024];
    int read = 0;
    while ((read = source.read(buffer)) != -1) {
        destination.write(buffer, 0, read);
    }
    destination.flush();
}
Your code using this method would look something like this:
try (FileOutputStream fos = new FileOutputStream(outFile)) {
    pipeStreams(zips, fos);
}
Upvotes: 4