Samuel Bushi

Reputation: 341

OutOfMemoryError on tomcat7

I am developing a web app that takes a zip file uploaded by the user, unzips it on the server, and processes the files. It works like a charm when the zip file is not too large (20-25 MB), but a file at or above roughly 50 MB produces an OutOfMemoryError.

I have tried to increase the maximum Java heap size by adding export CATALINA_OPTS="-Xmx1024M" to startup.sh in Tomcat 7, but the error still persists.

AFAIK, the problem is in unzipping the .zip file: top shows that Tomcat uses 800 MB of memory during the extraction of the 50 MB file. Is there any solution that would enable up to ~200 MB uploads while using the available memory efficiently?

The code for unzipping is as follows:

package user;

import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class unzip {

    public void unzipFile(String filePath, String oPath)
    {
        FileInputStream fis = null;
        ZipInputStream zipIs = null;
        ZipEntry zEntry = null;
        try {
            fis = new FileInputStream(filePath);
            zipIs = new ZipInputStream(new BufferedInputStream(fis));
            while ((zEntry = zipIs.getNextEntry()) != null) {
                String opFilePath = oPath + zEntry.getName();
                File outFile = new File(opFilePath);
                if (zEntry.isDirectory()) {
                    // Directory entries are created, not written as files
                    outFile.mkdirs();
                    continue;
                }
                // Make sure parent directories exist for nested entries
                if (outFile.getParentFile() != null) {
                    outFile.getParentFile().mkdirs();
                }
                System.out.println("Extracting file to " + opFilePath);
                // 8 KB copy buffer, allocated once per entry
                byte[] tmp = new byte[8*1024];
                FileOutputStream fos = null;
                try {
                    fos = new FileOutputStream(outFile);
                    int size;
                    while ((size = zipIs.read(tmp)) != -1) {
                        fos.write(tmp, 0, size);
                    }
                    fos.flush();
                } catch (Exception ex) {
                    // Don't swallow errors silently; at least log them
                    ex.printStackTrace();
                } finally {
                    if (fos != null) {
                        fos.close();
                    }
                }
            }
            zipIs.close();
            fis.close();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

The error code is as follows:

HTTP Status 500 - javax.servlet.ServletException: java.lang.OutOfMemoryError: Java heap space

type Exception report

message javax.servlet.ServletException: java.lang.OutOfMemoryError: Java heap space

description The server encountered an internal error that prevented it from fulfilling this request.

exception

org.apache.jasper.JasperException: javax.servlet.ServletException: java.lang.OutOfMemoryError: Java heap space
    org.apache.jasper.servlet.JspServletWrapper.handleJspException(JspServletWrapper.java:549)
    org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:455)
    org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:390)
    org.apache.jasper.servlet.JspServlet.service(JspServlet.java:334)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:727)

root cause

javax.servlet.ServletException: java.lang.OutOfMemoryError: Java heap space
    org.apache.jasper.runtime.PageContextImpl.doHandlePageException(PageContextImpl.java:916)
    org.apache.jasper.runtime.PageContextImpl.handlePageException(PageContextImpl.java:845)
    org.apache.jsp.Upload_jsp._jspService(Upload_jsp.java:369)
    org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:432)
    org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:390)
    org.apache.jasper.servlet.JspServlet.service(JspServlet.java:334)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:727)

root cause

java.lang.OutOfMemoryError: Java heap space
    org.apache.commons.io.output.ByteArrayOutputStream.toByteArray(ByteArrayOutputStream.java:322)
    org.apache.commons.io.output.DeferredFileOutputStream.getData(DeferredFileOutputStream.java:213)
    org.apache.commons.fileupload.disk.DiskFileItem.getSize(DiskFileItem.java:289)
    org.apache.jsp.Upload_jsp._jspService(Upload_jsp.java:159)
    org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:432)
    org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:390)
    org.apache.jasper.servlet.JspServlet.service(JspServlet.java:334)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:727)

note The full stack trace of the root cause is available in the Apache Tomcat/7.0.52 (Ubuntu) logs.
Apache Tomcat/7.0.52 (Ubuntu)

Surprisingly, there was nothing in the catalina.out file regarding this exception.

Thanks in advance.

EDIT: Code using DiskFileItem in Upload.jsp

//necessary imports go here
File file;
int maxFileSize = 1000 * 1000 * 1024;
int maxMemSize = 1000 * 1024;
ServletContext context = pageContext.getServletContext();
String filePath = context.getInitParameter("file-upload");
String contentType = request.getContentType();
if (contentType != null)
{
   if (contentType.indexOf("multipart/form-data") >= 0)
   {
      DiskFileItemFactory factory = new DiskFileItemFactory();
      factory.setSizeThreshold(maxMemSize);
      factory.setRepository(new File("/tmp/"));
      ServletFileUpload upload = new ServletFileUpload(factory);
      upload.setSizeMax(maxFileSize);
      try
      {
         List fileItems = upload.parseRequest(request);
         Iterator i = fileItems.iterator();
         while (i.hasNext())
         {
            FileItem fi = (FileItem) i.next();
            if (!fi.isFormField())
            {
               String fieldName = fi.getFieldName();
               String fileName = fi.getName();
               if (fileName.endsWith(".zip") || fileName.endsWith(".pdf")
                     || fileName.endsWith(".doc") || fileName.endsWith(".docx")
                     || fileName.endsWith(".ppt") || fileName.endsWith(".pptx")
                     || fileName.endsWith(".html") || fileName.endsWith(".htm")
                     || fileName.endsWith(".epub") || fileName.endsWith(".djvu"))
               {
                  boolean isInMemory = fi.isInMemory();
                  long sizeInBytes = fi.getSize();
                  new File(filePath + fileName).mkdir();
                  filePath = filePath + fileName + "/";
                  file = new File(filePath + fileName.substring(fileName.lastIndexOf("/")));
                  fi.write(file);
                  String fileExtension = FilenameUtils.getExtension(fileName);
                  if (fileExtension.equals("zip"))
                  {
                     System.out.println("In zip.");
                     unzip mfe = new unzip();
                     mfe.unzipFile(filePath + fileName, filePath);
                     File zip = new File(filePath + fileName);
                     zip.delete();
                  }
                  File corePath = new File(filePath);
                  int count = 0;
                  //some more processing
               }
            }
         }
      }
      catch (Exception e)
      {
         //exception handling goes here
      }
   }
}

Upvotes: 2

Views: 1453

Answers (4)

Svetlin Zarev

Reputation: 15673

The issue is not in the unzip code you posted. The root cause is in:

java.lang.OutOfMemoryError: Java heap space
    org.apache.commons.io.output.ByteArrayOutputStream.toByteArray(ByteArrayOutputStream.java:322)
    org.apache.commons.io.output.DeferredFileOutputStream.getData(DeferredFileOutputStream.java:213)
    org.apache.commons.fileupload.disk.DiskFileItem.getSize(DiskFileItem.java:289)

Do you notice the ByteArrayOutputStream.toByteArray? It seems that you are writing to a ByteArrayOutputStream which grows too large. Please locate and post the code that uses this ByteArrayOutputStream, as your zip code does not use one.


Update: Judging from the code you've posted, your code is OK. But the FileItem.getSize() call does some nasty things:

283   public long getSize() {
284        if (size >= 0) {
285            return size;
286        } else if (cachedContent != null) {
287            return cachedContent.length;
288        } else if (dfos.isInMemory()) {
289            return dfos.getData().length;
290        } else {
291            return dfos.getFile().length();
292        }
293    }

If the file item's data is stored in memory, it calls getData(), which calls toByteArray():

209    public byte[] getData()
210    {
211        if (memoryOutputStream != null)
212        {
213            return memoryOutputStream.toByteArray();
214        }
215        return null;
216    }

Which in turn allocates a new array:

317    public synchronized byte[] toByteArray() {
318        int remaining = count;
319        if (remaining == 0) {
320            return EMPTY_BYTE_ARRAY; 
321        }
322        byte newbuf[] = new byte[remaining];
           //Do stuff
333        return newbuf;
334    }

So, for a short time, you have twice the normal memory consumption: a 50 MB upload held in memory means the stream's internal buffers plus another ~50 MB array allocated by toByteArray().

I would recommend the following:

  1. Set maxMemSize to no more than 8-32 KB (see the sketch after this list)

  2. Give more memory to the JVM process: -Xmx2g for example

  3. Make sure that you are not holding any unnecessary references to FileItems, as in your current configuration they consume a lot of memory.

  4. If the OOM happens again, take a heap dump. You can use the -XX:+HeapDumpOnOutOfMemoryError JVM flag to automatically create a heap dump for you. Then you can use a heap dump analyzer (for instance, Eclipse MAT) to check who is allocating so much memory and where it is being allocated.
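
For point 1, a minimal sketch of that factory configuration, assuming the standard commons-fileupload DiskFileItemFactory constructor (the UploadFactoryConfig class name and the /tmp repository are illustrative, not from the question):

import java.io.File;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;

public class UploadFactoryConfig {

    // A factory that keeps at most 8 KB of each item in memory; anything
    // larger is spooled straight to a temp file under /tmp, so getSize()
    // takes the dfos.getFile().length() branch instead of copying a huge
    // in-memory byte array.
    public static ServletFileUpload newUpload() {
        DiskFileItemFactory factory = new DiskFileItemFactory(8 * 1024, new File("/tmp"));
        return new ServletFileUpload(factory);
    }
}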

Upvotes: 2

TheCodingFrog

Reputation: 3514

The issue is that when the user uploads a zip file, the entire zip file gets read into memory; the stack trace shows the error is thrown while making a call to

DiskFileItem.getSize()

From the source code of DiskFileItem, DiskFileItem.getSize() fetches all the data first:

283    public long getSize() {
284        if (size >= 0) {
285            return size;
286        } else if (cachedContent != null) {
287            return cachedContent.length;
288        } else if (dfos.isInMemory()) {
289            return dfos.getData().length;
290        } else {
291            return dfos.getFile().length();
292        }
293    }

By looking at the documentation of DeferredFileOutputStream.getFile():

Returns either the output file specified in the constructor or the temporary file created, or null.
If the constructor specifying the file is used, then it returns that same output file, even when the threshold has not been reached.
If the constructor specifying a temporary file prefix/suffix is used, then the temporary file created once the threshold is reached is returned. If the threshold was not reached, then null is returned.

Returns:
    The file for this output stream, or null if no such file exists.

Ideally, the user should not be allowed to upload a file of arbitrary size; there should be a maximum size limit based on your server capacity.
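
A minimal sketch of how such a cap could be enforced, assuming the standard commons-fileupload ServletFileUpload API (the LimitedUpload class, the handleUpload method, and the 200 MB figure are illustrative, not from the question):

import java.util.List;
import javax.servlet.http.HttpServletRequest;
import org.apache.commons.fileupload.FileItem;
import org.apache.commons.fileupload.FileUploadBase;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;

public class LimitedUpload {

    // Parses a multipart request, rejecting anything over 200 MB.
    public static List<FileItem> handleUpload(HttpServletRequest request) throws Exception {
        ServletFileUpload upload = new ServletFileUpload(new DiskFileItemFactory());
        upload.setSizeMax(200L * 1024 * 1024);      // whole-request cap
        upload.setFileSizeMax(200L * 1024 * 1024);  // per-file cap
        try {
            return upload.parseRequest(request);
        } catch (FileUploadBase.SizeLimitExceededException e) {
            // The request exceeded the cap; report it instead of running out of heap.
            throw new IllegalArgumentException("Upload too large", e);
        }
    }
}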

Upvotes: 1

a_z

Reputation: 352

It seems like your while loop is allocating too much memory.

Check how many times it runs to decide.

Mainly, this line below is the cause:

byte[] tmp = new byte[8*1024];

You can try to reduce 1024 to something like 10 and see if it still happens.
Also check the file size.

Upvotes: 0

defectus

Reputation: 1987

Allocating an 8 KB buffer for each zip entry seems to be just a finger-in-the-air approach. Try to use smaller buffers, say no more than 1 KB. Garbage collection doesn't occur continuously.

Try to use this approach:

int BUFFER_SIZE = 1024;
int size;
byte[] buffer = new byte[BUFFER_SIZE];

...
FileOutputStream out = new FileOutputStream(path, false);
BufferedOutputStream fout = new BufferedOutputStream(out, BUFFER_SIZE);

while ((size = zin.read(buffer, 0, BUFFER_SIZE)) != -1) {
    fout.write(buffer, 0, size);
}
// don't forget to flush and close the stream when done
fout.close();
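
For reference, a self-contained variant of the same approach (the EntryExtractor class and extractEntry method are illustrative names; zin is assumed to be a ZipInputStream positioned at the current entry):

import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.ZipInputStream;

public class EntryExtractor {

    private static final int BUFFER_SIZE = 1024;

    // Copies the current zip entry from zin to the given path using a
    // single small buffer, so per-entry allocation stays tiny.
    public static void extractEntry(ZipInputStream zin, String path) throws IOException {
        byte[] buffer = new byte[BUFFER_SIZE];
        BufferedOutputStream fout = new BufferedOutputStream(
                new FileOutputStream(path, false), BUFFER_SIZE);
        try {
            int size;
            while ((size = zin.read(buffer, 0, BUFFER_SIZE)) != -1) {
                fout.write(buffer, 0, size);
            }
        } finally {
            fout.close(); // flushes the buffered bytes before closing
        }
    }
}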

Upvotes: 0
