Shiv

Reputation: 4567

How to resolve java.lang.OutOfMemoryError reported as "java.lang.String", loaded by "<system class loader>" in Eclipse Memory Analyzer

I am reading some large XML files (around 800 MB) and storing them into a database.

It stores many records and then terminates with an exception:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at java.util.IdentityHashMap.resize(Unknown Source)
    at java.util.IdentityHashMap.put(Unknown Source)

Using Eclipse Memory Analyzer I have created .hprof files, which say:

  76,581 instances of "java.lang.String", loaded by "<system class loader>" occupy 1,04,34,45,504 (98.76%) bytes. 

Keywords
java.lang.String

I have setters and getters for retrieving values. How do I resolve this issue? Any help would be appreciated.


I have already tried increasing the memory through the JRE .ini file, but the problem is not solved.

EDIT: I am using scireumOpen to read XML files.

Example code I have used:

public void readD() throws Exception {

    XMLReader reader = new XMLReader();

    reader.addHandler("node", new NodeHandler() {

        @Override
        public void process(StructuredNode node) {
            try {
                obj.setName(node.queryString("name"));
                save(obj);
            } catch (XPathExpressionException xPathExpressionException) {
                xPathExpressionException.printStackTrace();
            } catch (Exception exception) {
                exception.printStackTrace();
            }
        }
    });

    reader.parse(new FileInputStream("C:/Users/some_file.xml"));
}

public void save(Reader obj) {

    try {
        EntityTransaction entityTransaction = em.getTransaction();
        entityTransaction.begin();
        Entity e1 = new Entity();
        e1.setName(obj.getName());

        em.persist(e1);
        entityTransaction.commit();

    } catch (Exception exception) {
        exception.printStackTrace();
    }
}

Upvotes: 3

Views: 2833

Answers (8)

Freak

Reputation: 6883

Don't use String if you can avoid it; replace it with StringBuffer or StringBuilder. Also, try increasing the memory. I guess 2048m is OK, but if the issue still persists, change it to 4096m or even try 6000m.
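The difference this answer points at can be sketched in a few lines (the class and method names here are illustrative, not from the question's code). Repeated `+=` on a String allocates a new String object on every pass, while StringBuilder appends into one resizable buffer:

```java
// Building a delimited string: String is immutable, so "+=" in a loop
// creates a fresh String object each iteration. StringBuilder reuses
// one internal buffer and converts to String once at the end.
public class ConcatDemo {
    static String withBuilder(int n) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) {
            sb.append(i).append(',');   // no intermediate Strings created
        }
        return sb.toString();           // one String, built once
    }

    public static void main(String[] args) {
        System.out.println(withBuilder(3)); // 0,1,2,
    }
}
```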

Upvotes: 1

Shiv

Reputation: 4567

Finally I have solved my problem. The following things helped:

1. A heap size of 2048m is enough.

2. Another problem was that I was using String,

and a String object is immutable.

By immutable, we mean that the value stored in the String object cannot be changed. The next question that comes to mind is: "If String is immutable, then how am I able to change the contents of the object whenever I wish to?" Well, to be precise, it's not the same String object that reflects the changes you make. Internally, a new String object is created to hold the changes.
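A minimal sketch of what "a new String object is created" means in practice (variable names are illustrative):

```java
// Two references to the same String; reassigning one via concatenation
// creates a brand-new String object and leaves the other untouched.
public class ImmutableDemo {
    static String[] demo() {
        String a = "foo";
        String b = a;            // b points at the same object as a
        a = a + "bar";           // a now points at a NEW String; b does not change
        return new String[] { a, b };
    }

    public static void main(String[] args) {
        String[] r = demo();
        System.out.println(r[0]); // foobar
        System.out.println(r[1]); // foo
    }
}
```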

Refer to Difference between string and stringbuffer, Stringbuilder.

So I removed getters and setters for entities other than the JPA entities, and inserted all data directly into the database without setting it on any intermediate objects.

3. The third and main problem was the JPA EntityManager.

My code didn't ensure the EntityManager is always closed when the method finishes. As soon as a RuntimeException occurs in the business logic, the em EntityManager remains open!

So always close it, and you can also set your objects to null in the finally block, like:

finally {
    Obj1 = null;
    Obj2 = null;
    if (entityTransaction.isActive())
        entityTransaction.rollback();
    em.clear();
    em.close();
}

Refer to How to close a JPA EntityManager in web applications.
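The always-close discipline can be sketched with any resource. Below, a hypothetical FakeManager stands in for the JPA EntityManager (it is not the JPA API); only the try/finally shape matters:

```java
// Sketch of the "always close, even on RuntimeException" pattern.
// FakeManager is an illustrative stand-in for an EntityManager.
public class CloseDemo {
    static class FakeManager implements AutoCloseable {
        boolean closed = false;
        void work(boolean fail) {
            if (fail) throw new RuntimeException("business logic failed");
        }
        @Override public void close() { closed = true; }
    }

    static FakeManager run(boolean fail) {
        FakeManager em = new FakeManager();
        try {
            em.work(fail);
        } catch (RuntimeException e) {
            // handle/log; the important part is the finally block below
        } finally {
            em.close(); // runs whether or not work() threw
        }
        return em;
    }

    public static void main(String[] args) {
        System.out.println(run(true).closed);  // true: closed despite the exception
        System.out.println(run(false).closed); // true: closed on the happy path too
    }
}
```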

+1 for every answer, guys, it helped me a lot. I am not marking any single answer because I thought of posting the complete solution instead. Thanks!

Upvotes: 0

user1516873

Reputation: 5203

It looks like you edited the code before posting it, or posted not exactly the right code. Please correct it.

First, your code will not compile.

Second, don't pass a Reader into the save function. Create and fill the Entity in process(StructuredNode node) and pass the Entity, not the Reader, to save.

Third, handle exceptions in the save function correctly: if an exception occurs, roll back the transaction.

Upvotes: 0

Uwe Plonus

Reputation: 9974

Try using another parser for XML processing.

Processing one big XML file of 800 MB with e.g. DOM is not feasible, as it takes up a great deal of memory.

Try using SAX or StAX in Java and process the parsing results at once, without trying to load the complete XML file into memory.

Also, don't keep the parsing results in memory in total. Write them as fast as possible into the database and scope your parsing results as narrowly as possible.
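A minimal StAX sketch of that process-and-forget style, using only the JDK's javax.xml.stream API (the element name and the counting stand in for real per-record processing):

```java
import java.io.StringReader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamReader;

// Streams the document event by event; only the current event is held
// in memory, so the file size barely matters.
public class StaxDemo {
    static int countElements(String xml, String name) throws XMLStreamException {
        XMLStreamReader r = XMLInputFactory.newInstance()
                .createXMLStreamReader(new StringReader(xml));
        int count = 0;
        try {
            while (r.hasNext()) {
                if (r.next() == XMLStreamConstants.START_ELEMENT
                        && r.getLocalName().equals(name)) {
                    count++; // process-and-forget: e.g. write this record to the DB here
                }
            }
        } finally {
            r.close();
        }
        return count;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<nodes><node>a</node><node>b</node></nodes>";
        System.out.println(countElements(xml, "node")); // 2
    }
}
```

For a real 800 MB file you would pass a FileInputStream instead of a StringReader; the memory profile stays flat either way.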

Perhaps use intermediate tables in the database and do the processing part on all datasets inside the database.

Upvotes: 5

Joop Eggen

Reputation: 109613

My main tip: check your JPA code once again. It should be as isolated as possible.

An idea would be to use JAXB with annotations. An IdentityHashMap (keys compared with == instead of equals) is a rare thing; it is likely from JPA, or maybe from XML tags? You could also look at which XML parser is used (inspect the factory class, or list all XML parser providers via the Java SPI, the service provider interface).

You could share (deduplicate) strings, for instance all strings shorter than 20 characters, using a Map<String, String>:

private Map<String, String> sharedStrings = new HashMap<>();

private String shareString(String s) {
    if (s == null || s.length() > 20) {
        return s;
    }
    String t = sharedStrings.get(s);
    if (t == null) {
        t = s;
        sharedStrings.put(t, t);
    }
    return t;
}

public void setXxx(String xxx) {
    this.xxx = shareString(xxx);
}

You could use compression (GZIP streams) for larger texts in the beans.
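A small sketch of that idea with the JDK's GZIP streams (method names are illustrative); repetitive text, common in XML payloads, compresses very well:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

// Keeps a large text field as a compressed byte[] and restores it on demand.
public class GzipDemo {
    static byte[] compress(String s) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(s.getBytes(StandardCharsets.UTF_8));
        }
        return bos.toByteArray();
    }

    static String decompress(byte[] data) throws Exception {
        try (GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(data))) {
            return new String(gz.readAllBytes(), StandardCharsets.UTF_8);
        }
    }

    public static void main(String[] args) throws Exception {
        String text = "value ".repeat(1000);              // 6000 chars of repetitive text
        byte[] packed = compress(text);
        System.out.println(packed.length < text.length()); // true
        System.out.println(decompress(packed).equals(text)); // true
    }
}
```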

Upvotes: 1

Devolus

Reputation: 22094

  1. The most obvious answer: increase your JVM memory, as has already been mentioned here, using java -XmxNN.
  2. Use a SAX parser instead of a DOM tree (if you don't do this already). This depends on your application design, so you have to look into it and see if this is a possible strategy.
  3. Check your code and try to remove all objects which are not needed, so that they can be reclaimed by the GC. This can include e.g. moving variable declarations inside a loop instead of having them outside it, so that the references are released early, and setting unused elements to null once you no longer need them.

Without knowing your code, these are only general guidelines.
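Point 3 can be sketched like this (class and method names are illustrative):

```java
// Declaring 'record' inside the loop limits its lifetime to one
// iteration: after the body finishes, nothing references the buffer and
// the GC may reclaim it before the next record is read. A declaration
// outside the loop would pin the last buffer until the method returns.
public class ScopeDemo {
    static int process(int[] sizes) {
        int saved = 0;
        for (int size : sizes) {
            byte[] record = new byte[size]; // scoped to this iteration
            saved += record.length;         // stand-in for save(record)
        }                                   // 'record' is unreachable here
        return saved;
    }

    public static void main(String[] args) {
        System.out.println(process(new int[] {10, 20})); // 30
    }
}
```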

Upvotes: 1

DaoWen

Reputation: 33029

You can increase your heap size when you launch Java:

java -Xmx8G

Upvotes: 0

Juned Ahsan

Reputation: 68715

Your heap is too limited to hold such a big XML file in memory. Try to increase the heap size using the -Xmx JRE option.

or

try http://vtd-xml.sourceforge.net/ for faster and lighter XML processing.

Upvotes: 2
