user2293963

Reputation: 1

Java process hangs while launching an executable that uses the file system

I'm trying to execute the 'edena' bioinformatics program from within my Java code. The process reads input files and writes to output files. When the input files are small (~1 MB), the process finishes and exits perfectly. When the input files are larger (~80 MB), the process just hangs. Invoking the process from the command line works just fine, so I suspect it has something to do with buffers etc. I'm working on Ubuntu 12.04.10 with 4 GB RAM (don't know if it is relevant). This is the code that hangs:

String edena_exe1 = "edena -M 75 -p " + workshopDir + BinAssembly.cliqueFilesDir
    + "clique_" + c.getId() + " -DRpairs " + workshopDir + BinAssembly.cliqueFilesDir
    + "/clique" + c.getId() + "pair1.fna " + workshopDir + BinAssembly.cliqueFilesDir
    + "/clique" + c.getId() + "pair2.fna ";
Process edena_proc1 = Runtime.getRuntime().exec(edena_exe1);
edena_proc1.waitFor();

thanks!

Upvotes: 0

Views: 379

Answers (1)

OldCurmudgeon

Reputation: 65803

I suspect that with the larger input files the process generates more output.

When a process is started by the JVM, it is given limited buffered streams for its output and error. If you do not bleed those streams while the process is running, a buffer may eventually fill up and the child process will block while writing to it.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

private static void dir() throws IOException {
  Runtime r = Runtime.getRuntime();
  Process p = r.exec("DIR C:\\ /S");
  BufferedReader br = new BufferedReader(new InputStreamReader(p.getInputStream()));
  try {
    String line;
    // Bleed the output so the buffer never fills and blocks the child.
    while ((line = br.readLine()) != null) {
      System.out.println(line);
    }
  } finally {
    br.close();
  }
  // Just in case the process is still alive.
  p.destroy();
}
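
Applied to the edena invocation from the question, a minimal sketch might look like this (the method name and parameters below are placeholders assembled from the question's code, not a tested invocation). Using ProcessBuilder with redirectErrorStream(true) merges stderr into stdout, so one read loop drains everything before waitFor() is called.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

// Hypothetical adaptation: names and paths mirror the question's code.
static int runEdena(String workshopDir, String cliqueDir, String cliqueId)
    throws IOException, InterruptedException {
  // Building the command as a list avoids shell-style splitting of arguments.
  ProcessBuilder pb = new ProcessBuilder(
      "edena", "-M", "75",
      "-p", workshopDir + cliqueDir + "clique_" + cliqueId,
      "-DRpairs",
      workshopDir + cliqueDir + "/clique" + cliqueId + "pair1.fna",
      workshopDir + cliqueDir + "/clique" + cliqueId + "pair2.fna");
  pb.redirectErrorStream(true); // merge stderr into stdout: one stream to drain
  Process p = pb.start();
  BufferedReader br = new BufferedReader(new InputStreamReader(p.getInputStream()));
  try {
    String line;
    while ((line = br.readLine()) != null) {
      System.out.println(line); // drain continuously so the pipe buffer never fills
    }
  } finally {
    br.close();
  }
  return p.waitFor(); // safe to wait now: all output has been consumed
}

If you keep Runtime.exec instead, remember that getErrorStream() must be drained as well, ideally from a separate thread, or the same deadlock can occur on stderr.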

Upvotes: 1
