Reputation: 602
I'm trying to compress files inside a directory using the Java FileSystem API. It works fine when there are only a few files, but it fails when there are more than 100 files.
This is the code which I used in my program:
Map<String, String> env = new HashMap<>();
env.put("create", "true");
URI uri = URI.create("jar:file://10.0.8.31/Shared/testFile.zip");
long bytesRead = 0;
File dir = new File("D:\\Shared\\DPXSequence");
try (FileSystem zipfs = FileSystems.newFileSystem(uri, env)) {
    for (File sourceF : dir.listFiles()) {
        Path externalFile = Paths.get(sourceF.getAbsolutePath());
        Path pathInZipfile = zipfs.getPath("/" + sourceF.getName());
        // copy a file into the zip file
        Files.copy(externalFile, pathInZipfile, StandardCopyOption.REPLACE_EXISTING);
    }
} catch (Exception e) {
    System.out.println("Error : " + e.toString());
}
This is the error which I'm getting:
Exception in thread "AWT-EventQueue-0" java.lang.OutOfMemoryError: Java heap space
Where am I going wrong?
I think Files.copy() is completing before the file is actually compressed and copied to the destination folder. Is that causing the issue?
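For comparison, the same per-file loop is sometimes written with java.util.zip.ZipOutputStream, which streams each entry straight to the target file instead of going through a zip FileSystem. This is only an illustrative sketch using a hypothetical temp directory in place of D:\Shared\DPXSequence, not the code above:

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class StreamZip {
    // Streams every file in dir into the zip archive at target.
    // Returns the number of entries written.
    static int zipDirectory(Path dir, Path target) throws IOException {
        int count = 0;
        try (ZipOutputStream zos = new ZipOutputStream(Files.newOutputStream(target));
             DirectoryStream<Path> files = Files.newDirectoryStream(dir)) {
            for (Path source : files) {
                zos.putNextEntry(new ZipEntry(source.getFileName().toString()));
                Files.copy(source, zos); // bytes go straight to the output stream
                zos.closeEntry();
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical sample data standing in for the real source directory.
        Path dir = Files.createTempDirectory("dpx");
        Files.write(dir.resolve("a.txt"), "hello".getBytes());
        Files.write(dir.resolve("b.txt"), "world".getBytes());
        Path zip = Files.createTempFile("testFile", ".zip");
        System.out.println("entries written: " + zipDirectory(dir, zip));
    }
}
```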
Upvotes: 1
Views: 577
Reputation: 697
Either the file you are trying to copy is too big, or for whatever reason System.gc() is not being invoked to clear up the memory used.
So check the following:
Add System.gc(); after the Files.copy() call in the method
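Applied to the question's loop, that suggestion would look roughly like this. The paths are swapped for a hypothetical temp directory so the sketch is self-contained; the System.gc() call after each copy is the only substantive change:

```java
import java.net.URI;
import java.nio.file.DirectoryStream;
import java.nio.file.FileSystem;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.HashMap;
import java.util.Map;

public class ZipWithGc {
    // Copies every file in dir into a freshly created zip via the zip
    // FileSystem, hinting a garbage-collection pass after each copy.
    static Path zipAll(Path dir) throws Exception {
        Path zip = Files.createTempDirectory("out").resolve("testFile.zip");
        Map<String, String> env = new HashMap<>();
        env.put("create", "true");
        URI uri = URI.create("jar:" + zip.toUri());

        try (FileSystem zipfs = FileSystems.newFileSystem(uri, env);
             DirectoryStream<Path> files = Files.newDirectoryStream(dir)) {
            for (Path externalFile : files) {
                Path pathInZipfile = zipfs.getPath("/" + externalFile.getFileName());
                Files.copy(externalFile, pathInZipfile, StandardCopyOption.REPLACE_EXISTING);
                System.gc(); // hint the JVM to reclaim memory after each file
            }
        }
        return zip;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical sample directory standing in for the real source.
        Path dir = Files.createTempDirectory("src");
        Files.write(dir.resolve("a.txt"), "data".getBytes());
        System.out.println("created " + zipAll(dir));
    }
}
```

Note that System.gc() is only a hint to the JVM; it is not guaranteed to run a collection.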
OR
Run the java program with more memory allocated to it:
java -Xmx1024M -Xms1024M -jar jarName.jar (if it's a jar). -Xmx is the maximum amount of memory you want to allocate (in MB, with the M suffix) and -Xms is the initial amount. You can replace 1024 with anything you want; just don't exceed the RAM on your computer.
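To confirm the flag took effect, the program can print its own heap limit, which approximately reflects the -Xmx value:

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Maximum heap the JVM will attempt to use, in megabytes.
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap: " + maxMb + " MB");
    }
}
```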
Upvotes: 1