agarwav

Reputation: 400

Memory Error in Stanford CoreNLP (Eclipse)

package src;

import java.util.Properties;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;

public class NLPTest {
    public static void main(String[] args) {
        // Configure the full annotator pipeline; loading the NER and parser
        // models is what consumes most of the heap.
        Properties props = new Properties();
        props.put("annotators", "tokenize, ssplit, pos, lemma, ner, parse, dcoref");
        StanfordCoreNLP coreNLP = new StanfordCoreNLP(props);
    }
}

I ran this sample code in Eclipse, but it gives the following error:

Loading classifier from edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz ... Exception in thread "main" java.lang.OutOfMemoryError: Java heap space

However, everything works perfectly when I run Stanford CoreNLP from the command prompt. Can anybody tell me the solution? Is it related to the memory allocated to programs run from Eclipse?

Upvotes: 2

Views: 4323

Answers (5)

moritz.vieli

Reputation: 1807

If others stumble across this issue: in my case I could cut memory consumption in half or more just by upgrading from Java 1.8 to Java 14. It seems the Java version used has a heavy impact.

Upvotes: 0

demongolem

Reputation: 9708

To follow up on Chris's answer with some concrete memory values specific to this problem (CoreNLP), rather than just standard Java-with-Eclipse guidance:

If you are using -Xmx1500m, that is likely not enough. The values mentioned in the other answers, which admittedly are meant just as examples, are not enough either. Running with -Xmx3500m is enough to let the coreference resolution stage of the pipeline complete. As Chris says, this is where a 64-bit JVM is required: Eclipse won't let you allocate that much heap memory if you have chosen 32-bit tools.
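
For illustration, launching the class from the question with that much heap from the command line might look like the sketch below (the jar names and classpath entries are assumptions; adjust them to your CoreNLP distribution):

java -Xmx3500m -cp ".;stanford-corenlp.jar;stanford-corenlp-models.jar" src.NLPTest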

Upvotes: 1

Christopher Manning

Reputation: 9450

The Eclipse problem is that you need to set not the amount of memory that Eclipse itself gets (the eclipse.ini file), but rather the amount of memory that a Java program run from Eclipse gets. This is specified under Run | Run Configurations, as detailed in other Stack Overflow answers.

But also, are you running with a 32-bit JVM? You may well need a 64-bit JVM to be able to allocate enough memory for Stanford CoreNLP to run happily.
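
As a sketch, putting something like the following in the VM arguments box of the run configuration should do it (the exact value is an assumption; another answer here reports that -Xmx3500m is enough for the full pipeline):

-Xmx3500m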

Upvotes: 2

aravindKrishna

Reputation: 440

For Eclipse, there is an eclipse.ini file next to eclipse.exe, which contains settings like these:

-Xmn128m
-Xms256m
-Xmx768m
-Xss1m
-XX:PermSize=128m
-XX:MaxPermSize=384m

Change the heap size (-Xmx) here; then your program won't hit an OOM.
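
For instance, an edited file might raise the heap settings like this (the values are only illustrative; pick what your machine can spare):

-Xms512m
-Xmx2048m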

Upvotes: 1

aravindKrishna

Reputation: 440

An OOM or OOME (OutOfMemoryError) simply means that the JVM ran out of memory. When this occurs, you basically have two choices:

1. Allow the JVM to use more memory via the -Xmx VM argument. For instance, to allow the JVM to use 1 GB (1024 MB) of memory:

java -Xmx1024m HelloWorld

2. Improve/fix the application so that it uses less memory.

Start the application with the VM argument -XX:+HeapDumpOnOutOfMemoryError. This tells the VM to produce a heap dump when an OOM occurs:

java -XX:+HeapDumpOnOutOfMemoryError ...

I suggest you run it like this from your command prompt:

java -Xms64m -Xmx256m HelloWorld

Here -Xms64m sets the minimum heap size to 64 MB and -Xmx256m sets the maximum heap size to 256 MB; instead of HelloWorld, put your class name.
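
Applied to the class in the question, a sketch might look like this (the classpath entries are assumptions, and the other answers here suggest the full CoreNLP pipeline needs considerably more than 256 MB, e.g. around 3.5 GB):

java -Xms64m -Xmx3500m -cp ".;stanford-corenlp.jar;stanford-corenlp-models.jar" src.NLPTest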

Upvotes: 0
