viki.omega9

Reputation: 335

Using Stanford CoreNLP

I am trying to get started with Stanford CoreNLP. I used some code from the web to understand how the coreference tool works. I tried running the project in Eclipse but keep encountering an out-of-memory exception. I tried increasing the heap size, but it made no difference. Any ideas why this keeps happening? Is the problem specific to my code? Any pointers on using CoreNLP would be great.

EDIT - Code Added

import edu.stanford.nlp.dcoref.CorefChain;
import edu.stanford.nlp.dcoref.CorefCoreAnnotations;
import edu.stanford.nlp.pipeline.Annotation;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;

import java.util.Map;
import java.util.Properties;

public class TestMain {

    public static void main(String[] args) {

        String text = "Viki is a smart boy. He knows a lot of things.";
        Annotation document = new Annotation(text);
        Properties props = new Properties();
        // dcoref needs lemma and ner in addition to tokenize, ssplit, pos, parse
        props.setProperty("annotators", "tokenize, ssplit, pos, lemma, ner, parse, dcoref");
        StanfordCoreNLP pipeline = new StanfordCoreNLP(props);
        pipeline.annotate(document);

        Map<Integer, CorefChain> graph =
                document.get(CorefCoreAnnotations.CorefChainAnnotation.class);

        // Iterate over the entries directly; the map is keyed by Integer,
        // so graph.get(...) with a String key would always return null.
        for (Map.Entry<Integer, CorefChain> entry : graph.entrySet()) {
            System.out.println(entry.getKey() + " " + entry.getValue());
        }
    }
}
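One pitfall worth noting with this map: `Map.get` takes an `Object`, so calling it with a `String` key on an `Integer`-keyed map compiles fine but always returns `null`, which then causes a `NullPointerException` on `.toString()`. A minimal stdlib-only sketch of the mismatch (hypothetical class name):

```java
import java.util.HashMap;
import java.util.Map;

public class KeyTypeDemo {
    public static void main(String[] args) {
        Map<Integer, String> graph = new HashMap<>();
        graph.put(1, "chain");

        // Wrong key type: compiles, but the lookup silently misses.
        System.out.println(graph.get("1"));  // null
        // Matching key type: the lookup succeeds.
        System.out.println(graph.get(1));    // chain
    }
}
```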

Upvotes: 7

Views: 9426

Answers (3)

user1374131

Reputation: 131

Fix for Eclipse: you can configure this in the Eclipse preferences as follows:

  1. Window -> Preferences (on Mac: Eclipse -> Preferences)
  2. Java -> Installed JREs
  3. Select the JRE and click Edit
  4. In the "Default VM arguments" field, enter "-Xmx1024M" (or your preferred limit; 1024M gives the JVM 1 GB of heap)
  5. Click Finish, then OK.
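To confirm the setting actually took effect, a small stdlib-only check (hypothetical class name) prints the maximum heap the JVM will use; with `-Xmx1024M` this should report roughly 1024 MB (the exact figure varies slightly by JVM):

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Maximum heap the JVM will attempt to use, in bytes.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```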

Upvotes: 3

Akash

Reputation: 119

You can also set the heap size per launch: right-click -> Run As -> Run Configurations..., then add it (e.g. "-Xmx1024M") under VM arguments. I have tested this on a Mac and it works.

Upvotes: 2

Khairul

Reputation: 1483

I ran into a similar problem when building a small application using Stanford CoreNLP in Eclipse.
Increasing Eclipse's own heap size will not solve the problem.
After some searching, it turned out it is the heap size of the Ant build/run target that needs to be increased, but I had no idea how to do that.
So I gave up on Eclipse and used NetBeans instead.

PS: You will eventually hit an out-of-memory exception with the default settings in NetBeans too, but it is easily solved by adjusting the -Xmx setting on a per-application basis.
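For an Ant-based NetBeans project, one place to do this is the project's `nbproject/project.properties` file (a sketch; the value is just an example and depends on your machine):

```
# Pass a larger max heap to the application when it is run from NetBeans
run.jvmargs=-Xmx2g
```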

Upvotes: 4
